A Shell-fish use of AI - Exporting Flickr Zips
Imagine: you decide to back up your photos from a website such as Flickr, only to find that it generates over a hundred files. Imagine going through and downloading every file manually. Imagine clicking hyperlinks one hundred and sixty-eight times. Imagine having so little pressure on your time.
That’s where a quick prompt will get AI to write a shell script that does the work for you. A shell script is a few lines of code, saved in a .sh file, that you run to handle a repetitive task or chore for you.
In my case I decided to download my Flickr library once again. I noticed that, unlike Google Takeout, it does not give me the option of saying “create 50 GB files”, so when it output 168 small photo zips I had to accept that this is what I have to play with. Some people will say “work”, but as it’s about learning I write “play”.
Understanding the URL structure
The beauty of Flickr export files is that they’re sequential: each address is just the same base URL with the next number, so the file after url-n.zip is url-n+1.zip. This means you can easily write a loop that counts up from 1 to 168 and stops when the files run out.
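As a quick illustration of how predictable the addresses are (the base URL below is a placeholder, not a real Flickr link), a simple counter in a loop is enough to generate every address:

```bash
#!/usr/bin/env bash
# Sketch only: the base URL here is made up, not a real Flickr link.
base_url="https://example.com/file"

# Because the export files are numbered sequentially, a counter is all
# you need to produce every address from 1 to 168.
for n in $(seq 1 168); do
  echo "${base_url}${n}.zip"
done
```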
The Euria Summary:
- Prompts you to enter the base URL (e.g., https://example.com/file)
- Assumes filenames follow the pattern <base_url><n>.zip (e.g., https://example.com/file4.zip)
- Downloads files sequentially (1 file at a time)
- Stops when it fails to find the next file (e.g., 169)
- Creates a download_summary.txt listing all successfully downloaded files
For the sake of this experiment I told Euria that I want a script where I supply the URL, it downloads one file at a time, and when it finds there are no more files it says “task completed” and produces a summary text file for me to check, along the lines of the sketch below.
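For reference, here is a rough sketch of what such a script might look like. It is not the exact script Euria gave me; the prompt text, the curl flags and the filenames are my own assumptions about one reasonable way to do it.

```bash
#!/usr/bin/env bash
# A sketch of the kind of script I asked for. Not Euria's exact output;
# the curl flags and filenames are my own assumptions.

read -r -p "Enter the base URL (e.g., https://example.com/file): " base_url

summary_file="download_summary.txt"
: > "$summary_file"   # start with an empty summary file

n=1
while true; do
  url="${base_url}${n}.zip"
  echo "Downloading ${url} ..."
  # curl -f returns a non-zero exit code on a 404, which acts as the stop signal
  if curl -f -L -O "$url"; then
    echo "${url}" >> "$summary_file"
    n=$((n + 1))
  else
    echo "No file found at ${url}, so the export looks complete."
    break
  fi
done

echo "task completed"
echo "Downloaded $((n - 1)) files. See ${summary_file} for the full list."
```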
And Finally
A few days ago I requested data from Google Takeout, but when I saw that I had over one hundred files I aborted that export and asked for the 50 GB option, and that worked. With Flickr I was not given a choice, hence the tool.
AI can provide you with the simple tool you’re looking for without hours of searching.