Instructions for uploading your SIS file and automating uploads on a schedule
This guide explains how to upload your Student Information System (SIS) file (a CSV) to uConnect’s secure Amazon S3 bucket — and how to optionally automate uploads on a schedule.
You can upload manually using the AWS Command Line Interface (CLI) or FileZilla Pro, or automate it with a cron job (Linux/macOS) or Task Scheduler (Windows).
Tip: Anything shown in ALL_CAPS (like YOUR_BUCKET_NAME) is a placeholder — replace it with your real value.
Prerequisites
Before uploading, confirm you have:
- Bucket name: `YOUR_BUCKET_NAME` (provided by uConnect)
- AWS Access Key ID and AWS Secret Access Key (provided securely by uConnect)
- Region: use `us-east-1` unless told otherwise
- Your file: a CSV named `student_feed.csv` that meets RFC 4180 standards:
  - Comma-separated values
  - One row per line
  - Fields containing commas enclosed in double quotes

Example:

```
student_id,first_name,last_name,email,major
12345,Alex,Morgan,alex.morgan@example.edu,Biology
67890,"Taylor, Jr.",Lee,taylor.lee@example.edu,Chemistry
```
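Before uploading, it can help to confirm the file exists and eyeball its header. A minimal sketch in shell — it writes the example rows from this guide to a local file purely for demonstration; in practice, point `f` at your real `student_feed.csv` and delete the `cat` block:

```shell
#!/bin/sh
# Demo of a pre-upload sanity check, run here against the example rows
# from this guide. Point "f" at your real student_feed.csv instead.
f="student_feed.csv"
cat > "$f" <<'EOF'
student_id,first_name,last_name,email,major
12345,Alex,Morgan,alex.morgan@example.edu,Biology
67890,"Taylor, Jr.",Lee,taylor.lee@example.edu,Chemistry
EOF

# Show the header row so you can verify the column names.
echo "header: $(head -n 1 "$f")"

# Count data rows (total lines minus the header).
rows=$(($(wc -l < "$f") - 1))
echo "data rows: $rows"
```

If the header or row count looks wrong, fix the export from your SIS before uploading.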
Option A — Windows PowerShell (AWS CLI)
Step 1: Install the AWS CLI
1. Open PowerShell.
2. Check whether the AWS CLI is installed:

   ```powershell
   aws --version
   ```

   If not, download AWS CLI v2 for Windows.
Step 2: Configure your credentials
aws configure
Enter:

- AWS Access Key ID: `YOUR_ACCESS_KEY_ID`
- AWS Secret Access Key: `YOUR_SECRET_ACCESS_KEY`
- Default region name: `us-east-1`
- Default output format: press Enter to skip
Step 3: Upload your file
```powershell
aws s3 cp "C:\Data\student_feed.csv" "s3://YOUR_BUCKET_NAME/student_feed.csv" --region us-east-1
```
Success looks like: PowerShell prints a line beginning with `upload:` and ending with your S3 path.

Check the exit code:

```powershell
$LASTEXITCODE  # 0 means success
```
Step 4: Verify your upload
```powershell
aws s3 ls "s3://YOUR_BUCKET_NAME/" --region us-east-1
```
Step 5 (Optional): Automate weekly uploads
You can automate uploads with Windows Task Scheduler: create a weekly task that runs the same command as above, then confirm a `0x0` last-run result after it runs (indicating success).
Option B — Linux or macOS Terminal (AWS CLI)
Step 1: Install the AWS CLI
- Using pip (if Python is installed):

  ```shell
  pip install --upgrade --user awscli
  ```

- Using the bundled installer:

  ```shell
  curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
  unzip awscli-bundle.zip
  sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
  aws --version
  ```

- macOS via Homebrew:

  ```shell
  brew install awscli
  aws --version
  ```
Step 2: Configure your credentials
```shell
aws configure
# AWS Access Key ID: YOUR_ACCESS_KEY_ID
# AWS Secret Access Key: YOUR_SECRET_ACCESS_KEY
# Default region name: us-east-1
# Default output format: (press Enter to skip)
```
Step 3: Upload your file
```shell
aws s3 cp /path/to/student_feed.csv s3://YOUR_BUCKET_NAME/student_feed.csv --region us-east-1
```
Check for a successful transfer:

```shell
echo $?  # 0 means success
```
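The `$?` check is a general shell feature, not specific to the AWS CLI; a minimal illustration:

```shell
# $? holds the exit status of the most recent command:
# 0 on success, non-zero on failure.
true
echo "after true: $?"   # prints 0
false
echo "after false: $?"  # prints 1
```

Note that `$?` is overwritten by every command, so capture it immediately (e.g. `status=$?`) if you need it later.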
Step 4: Verify your upload
```shell
aws s3 ls s3://YOUR_BUCKET_NAME/ --region us-east-1
```
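In a script, you can confirm the file appears in the listing by piping to `grep`. A sketch — the listing line below is simulated so the example is self-contained and only illustrates the `aws s3 ls` output shape (date, time, size, key); in practice use the real command shown in the comment:

```shell
#!/bin/sh
# Sketch: check that student_feed.csv appears in the bucket listing.
# Replace the simulated line with the real command:
#   listing=$(aws s3 ls "s3://YOUR_BUCKET_NAME/" --region us-east-1)
listing="2024-05-04 08:05:12      10452 student_feed.csv"

if echo "$listing" | grep -q "student_feed.csv"; then
  echo "file present"
else
  echo "file missing" >&2
fi
```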
Automating uploads (Linux/macOS)
After confirming a successful manual upload, automate future uploads using cron.
1. Open your crontab:

   ```shell
   crontab -e
   ```

2. Add a schedule line. For example, to upload every Saturday at 8:05 AM:

   ```shell
   5 8 * * Sat aws s3 cp /path/to/student_feed.csv s3://YOUR_BUCKET_NAME/student_feed.csv --region us-east-1 >> "$HOME/uconnect_upload.log" 2>&1
   ```

   This saves a log of each run at `~/uconnect_upload.log`.
Cron format reference:
```
+---------------- minute (0 - 59)
|  +------------- hour (0 - 23)
|  |  +---------- day of month (1 - 31)
|  |  |  +------- month (1 - 12)
|  |  |  |  +---- day of week (0 - 6) (Sunday=0 or 7)
|  |  |  |  |
*  *  *  *  *  command to be executed
```
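For anything beyond a one-liner, a small wrapper script keeps the crontab tidy and makes logging explicit. A sketch under this guide's placeholders — the file path and bucket name are the same placeholders used above, and `upload_feed.sh` is a hypothetical name, not something uConnect supplies:

```shell
#!/bin/sh
# upload_feed.sh - hypothetical cron wrapper (a sketch, not uConnect-supplied).
# Logs a timestamp and the AWS CLI exit status on every run.
FILE="/path/to/student_feed.csv"
DEST="s3://YOUR_BUCKET_NAME/student_feed.csv"
LOG="$HOME/uconnect_upload.log"

{
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] starting upload of $FILE"
  aws s3 cp "$FILE" "$DEST" --region us-east-1
  status=$?   # capture immediately; $? is overwritten by the next command
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] aws exited with status $status"
} >> "$LOG" 2>&1
```

The crontab line then shrinks to something like `5 8 * * Sat /path/to/upload_feed.sh`, and every run, successful or not, leaves a timestamped record in the log.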
You’re now fully automated! If you’d like, upload a test file and ask your uConnect contact to verify receipt.
Option C — FileZilla Pro (Point-and-Click)
Step 1: Install
Download and install FileZilla Pro (S3 support is only in the Pro version).
Step 2: Connect to your S3 bucket
1. Open File → Site Manager… → New Site.
2. Protocol: Amazon S3
3. Enter your credentials:
   - Access Key ID: `YOUR_ACCESS_KEY_ID`
   - Secret Access Key: `YOUR_SECRET_ACCESS_KEY`
   - Region: `us-east-1`
4. Click Connect and approve any prompts.
Step 3: Upload
1. Open your S3 bucket in the right pane.
2. Locate your local `student_feed.csv` on the left.
3. Drag it from left to right.
4. If asked, choose Overwrite.
Step 4: Verify
Press F5 to refresh and confirm the file appears in your bucket with a current timestamp.
Troubleshooting
| Symptom | Likely cause | Fix |
| --- | --- | --- |
| `aws` is not recognized as a command | AWS CLI not installed or not on PATH | Reinstall AWS CLI v2 and restart terminal |
| Access denied or invalid credentials error | Typo in keys or wrong region | Re-run `aws configure` |
| Access denied despite correct keys | Key lacks bucket permission | Confirm with uConnect |
| "NoSuchBucket" error | Bucket name mistyped | Double-check exact name with uConnect |
| File uploads but doesn't appear in platform | File not named `student_feed.csv` | Rename and re-upload; check RFC 4180 format |
Security Notes
- Keep your AWS keys private — never email or share them.
- Rotate credentials if staff changes.
- Always use the same filename (`student_feed.csv`) for consistency.
Need Help?
If you’re missing credentials or bucket details, or you encounter errors:

- Contact your uConnect admin contact, or
- Submit a support request at support.gouconnect.com, including:
  - The command you ran or a screenshot,
  - The time of the attempt, and
  - Any error text.
Quick Reference Cheat Sheet
Windows (PowerShell):
```powershell
aws configure
aws s3 cp "C:\Data\student_feed.csv" "s3://YOUR_BUCKET_NAME/student_feed.csv" --region us-east-1
aws s3 ls "s3://YOUR_BUCKET_NAME/" --region us-east-1
```
Linux/macOS (Terminal):
```shell
aws configure
aws s3 cp /path/to/student_feed.csv s3://YOUR_BUCKET_NAME/student_feed.csv --region us-east-1
aws s3 ls s3://YOUR_BUCKET_NAME/ --region us-east-1
```
Automate with cron (Linux/macOS):
```shell
5 8 * * Sat aws s3 cp /path/to/student_feed.csv s3://YOUR_BUCKET_NAME/student_feed.csv --region us-east-1
```
FileZilla Pro:
- Site Manager → Amazon S3 → enter keys → Connect → drag `student_feed.csv` into `YOUR_BUCKET_NAME`.
Now, you're all set! If you have any questions, please submit a support request.