
Jennisunshine

macrumors newbie
Original poster
Jan 11, 2008
Hi everyone,

This is the first time I'm trying to write an AppleScript, so I was just looking for some pointers.

For one of my classes, my professor has posted a lot of past exams on the course website. They're all scans of actual exam papers from previous years, and they're all named with an incrementing number at the end, like 2004Apr01.jpg, 2004Apr02.jpg, and so on. It's been tedious saving them one by one, and I thought, hey, isn't that what Automator is supposed to do for you?

Can some wise soul here please point me in the right direction in creating an Automator "action" that lets me enter a starting URL like http://www.ubc.ca/2005may01.jpg, have it save that file, then save 2005may02.jpg automatically, and keep incrementing?

Sorry if I sound like a dummy. Is this something that can even be done?

Thanks for your help : )

J
 

lee1210

macrumors 68040
Jan 10, 2005
Dallas, TX
Code:
# print the day numbers 01-31, then have awk run curl for each one,
# saving each image into ~/Desktop/classfiles (create that folder first)
for i in 01 02 03 04 05 06 07 08 09 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31
do echo $i
done | awk '{system("curl http://www.ubc.ca/2005may"$1".jpg > ./Desktop/classfiles/2005may"$1".jpg")}'

I'm not too hot with AppleScript and Automator, but this should do it with the shell. You'll have to create the classfiles directory on the Desktop before running this from Terminal. I assumed the numbers were days, so I did 01-31. There's probably a faster way to do that in shell, but when I treated i as a number I couldn't get leading zeros in a quick way.
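
For reference, printf can produce the zero-padded day numbers directly. The following is just a sketch along the same lines, reusing the example URL and classfiles folder from above rather than anything tested against the real server:

Code:
# same idea, with printf zero-padding the day number
i=1
while [ "$i" -le 31 ]; do
    day=$(printf "%02d" "$i")
    curl -o ~/Desktop/classfiles/2005may${day}.jpg "http://www.ubc.ca/2005may${day}.jpg"
    i=$((i + 1))
done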

-Lee
 

thriftinkid

macrumors regular
Mar 24, 2008
Hi everyone,

This is the first time I'm trying to write an AppleScript, so I was just looking for some pointers.

<snip>

If all the JPGs are in the same folder, this might help. It takes every file in the folder and adds a prefix, such as a URL. You just have to type in where the folder is and what the URL info should be. It then makes a text file with a list of all the URLs. Hope this helps:

Paste this into your Script Editor application:

Code:
-- collect the names of every file in the folder (change the path to your own folder)
tell application "Finder"
    set theFolder to "Macintosh HD:Users:yourname:Desktop:Myfolder" as alias
    set theList to name of (every file in theFolder)
end tell

-- prepend the URL prefix to each file name
repeat with i from 1 to (number of items in theList)
    set theName to (item i of theList)
    set item i of theList to ("http://www.mynetwork.tv/myftpfolder/" & theName)
end repeat

-- join the list into one string, one URL per line
set theText to ""
repeat with i from 1 to (number of items in theList)
    set theText to (theText & (item i of theList) & return)
end repeat

-- show the result in a new TextEdit document
tell application "TextEdit"
    activate
    make new document
    set the text of the front document to theText
end tell
 

lee1210

macrumors 68040
Jan 10, 2005
Dallas, TX
If all the JPGs are in the same folder, this might help. It takes every file in the folder and adds a prefix, such as a URL. You just have to type in where the folder is and what the URL info should be. It then makes a text file with a list of all the URLs. Hope this helps:

<snip>

I think the OP is looking for the opposite of what you're gunning for. There are a bunch of files on a foreign system that she can access via HTTP. If she made a big list of files in a folder that matched the names of the files on the other system, she could use that script to get the URLs, but she would still need to grab 'em somehow. Thanks for pitching in, but the problem looks a little different this time.
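
That last grabbing step could itself be a short loop. As a sketch, assuming the URL list from the script above were saved as a plain-text file called urls.txt (one URL per line), something like this would pull everything down:

Code:
# fetch each URL listed in urls.txt, saving each file under its remote name
while read url; do
    curl -O "$url"
done < urls.txt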

-Lee
 

Jennisunshine

macrumors newbie
Original poster
Jan 11, 2008
Thanks!

Wow, thank you guys so much for your help! I had thought I'd have to fiddle with this for quite a while to get it to work, but Lee's solution was only a couple of lines long.

Thanks to both of you again for your efforts and generosity! :)

Jenn
 

hhas

macrumors regular
Oct 15, 2007
curl allows you to specify ranges, e.g.:

Code:
# {...} expands the month names and [01-31] the days; #1 and #2 in -o refer to those two patterns
curl -o '#1#2.jpg' 'http://www.ubc.ca/2005{Jan,Feb,Mar,Apr,May,Jun,Jul,Aug,Sep,Oct,Nov,Dec}[01-31].jpg'

man curl for more info.
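
If only the May 2005 files are needed, the same idea collapses to a single range (again just a sketch based on the example URL in this thread):

Code:
# [01-31] keeps the leading zero; #1 refers to that range in the output name
curl -o '2005may#1.jpg' 'http://www.ubc.ca/2005may[01-31].jpg'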
 