I think I've finally figured out the ffmpeg piece of my automated video encoder project. I cheated and used the Google to find some ffmpeg examples. I created a bash script front end to the command and attached it to a cron job. That works for now, but it has some pretty serious problems. Mostly, it can't queue, so when it monitors my downloads folder it just spawns a ton of processes trying to encode every file it finds at once. I need to figure out a simple queue system so you can set a global limit on concurrent encodes. I also want to add an option to encode only between certain hours.
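The queue and the hour window could probably both live in the cron wrapper itself. Here's a rough sketch of what I mean; the limit of 2, the 1am-7am window, and the ~/Downloads watch folder are all placeholder values, not anything I've settled on:

```shell
#!/bin/bash
# Sketch of a cron wrapper that caps concurrent encodes and only
# runs inside an allowed window. All specific values are placeholders.
MAX_ENCODES=2

# true (exit 0) if hour $1 falls inside the window [$2, $3)
in_encode_window() {
    local hour=$((10#$1))   # strip the leading zero that `date +%H` emits
    [ "$hour" -ge "$2" ] && [ "$hour" -lt "$3" ]
}

# true if fewer than MAX_ENCODES ffmpeg processes are already running
below_limit() {
    [ "$(pgrep -x ffmpeg | wc -l)" -lt "$MAX_ENCODES" ]
}

# Cron calls this every few minutes; it quietly does nothing unless
# we're inside the window and under the process limit.
maybe_encode() {
    in_encode_window "$(date +%H)" 1 7 || return 0
    below_limit || return 0
    local next
    next=$(ls -tr ~/Downloads/*.avi 2>/dev/null | head -1)
    [ -n "$next" ] && ffmpeg -i "$next" "${next%.avi}.mp4" &
}
```

A crontab entry like `*/5 * * * * /path/to/encode.sh` would then poll every five minutes, and because each run only starts one encode when there's headroom, the backlog drains itself instead of launching everything at once.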
This could all be done via conf files, but if I want an actual GUI for this app I'd rather do it on OS X. I made a quick mockup in Interface Builder for it:
I'll need to brush up on my Objective-C before I can start coding anything. I know the basics, but I haven't played around with it in probably two years. Specifically, I need to learn how to run external shell processes. The rest is just basic stuff: conditional checking, reading/writing preferences, etc. The other issue is how to include ffmpeg. I don't want the end user to have to build and install it themselves. VisualHub just included ffmpeg inside the app, so I guess I'll do that, but I'm not sure how to set that type of thing up in Xcode.
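From what I can tell, the VisualHub trick boils down to copying the ffmpeg binary into the app's Resources folder (a Copy Files build phase in Xcode should do it) and then launching it by its in-bundle path. A sketch of just the path layout, with an invented app name:

```shell
# Where a bundled ffmpeg would live inside the .app wrapper.
# "AutoEncoder.app" is a hypothetical name, not a real product.
bundled_ffmpeg() {
    echo "$1/Contents/Resources/ffmpeg"
}

# The app would then launch something along the lines of:
#   "$(bundled_ffmpeg /Applications/AutoEncoder.app)" -i in.avi out.mp4
```

The upside is the user never touches a compiler; the downside is I'd be on the hook for shipping updated ffmpeg builds myself.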
Future plans might include some very basic distributed processing. Probably just something like computer 1 encodes file A, computer 2 encodes file B, etc. I think that should be pretty easy to set up with Xgrid. I'm also interested in automating some metadata stuff. No clue how that would work. I would at least want to set the correct file type for iTunes (Movie/TV Show) and automatically insert cover art if a folder.jpg or cover.jpg exists.
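That naive split (computer 1 gets file A, computer 2 gets file B, and so on) is really just round-robin assignment. As a sketch, with made-up host names:

```shell
# Round-robin: 0-based file number $1 goes to machine ($1 mod N).
# The host names here are placeholders.
MACHINES=(mac-mini imac macbook)

machine_for_file() {
    echo "${MACHINES[$(( $1 % ${#MACHINES[@]} ))]}"
}
```

Xgrid would presumably handle the actual job dispatch and result collection, so something like this would only be the assignment logic.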