braindump = on
Testing the importer is straightforward. First, make sure your database is up to date and all of your code is checked out. (See TheSchwartz Setup to learn about setting up the TheSchwartz database so that this will work.)
Also make sure that you have enabled beta tools on your server: $ENABLE_BETA_TOOLS = 1;
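A quick sanity check for that flag, sketched here with a stand-in config file so it runs anywhere (in a real checkout the flag lives in your local config file, whose path depends on your setup):

```shell
# Stand-in config so this sketch is self-contained; in a real checkout,
# check your own local config file instead.
printf '$ENABLE_BETA_TOOLS = 1;\n' > config-local.pl

# grep -c counts matching lines: prints 1 when the flag is set
grep -c 'ENABLE_BETA_TOOLS = 1' config-local.pl
```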
Step 1: Go to http://your.site.com/misc/import.bml and fill out the form to schedule an import.
You can also schedule an import from the command line:
# schedule an import
bin/test/schedule-import -u xb95 -p SOMEPASSWORD -t xb95_on_dw -s livejournal.com
That schedules an import task for user xb95 on site livejournal.com with the given password. It will import the data to the target xb95_on_dw. Huzzah.
Step 2: Now you need to fire up the scheduler (and keep it running):
# keep scheduler running in the foreground, and watch the log
bin/worker/import-scheduler --foreground & tail -f logs/import-scheduler.log
Note that if you are importing entries, you will need to run this twice: the first pass schedules all the jobs that entries depend on, and the second pass schedules the entry imports themselves. If you are importing comments, you will need to run it three times: the first pass imports tags/friends/etc., the second imports entries, and the third imports comments.
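The multi-pass pattern above can be sketched as a loop: each scheduler pass unlocks the next layer of dependent jobs. The real command is bin/worker/import-scheduler --foreground; run_scheduler_pass below is a stand-in so the sketch is self-contained.

```shell
# Stand-in for one run of bin/worker/import-scheduler --foreground
run_scheduler_pass() { echo "scheduling pass $1"; }

# 2 passes suffice for entries; comments need all 3
for pass in 1 2 3; do
  run_scheduler_pass "$pass"
done
```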
With that running in one window you should see some noise that says it's scheduling jobs. Woot! That's good. (If you see an error instead that says "worker died: can't insert job (probably duplicate job)", you probably haven't set up your TheSchwartz database.) But now you need the workers that actually get stuff done.
# start TheSchwartz worker manually
bin/worker/content-importer --verbose
You will see any noise that happens; warnings and other STDERR output will be dumped to the console. (Which generally means we need better debugging output for the importer; sending it to STDERR in what is supposed to be a daemon is not really going to be that useful...)
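Since the warnings go to STDERR, one way to keep them for later is to merge the streams and tee to a log while still watching the console. A minimal sketch, using fake_worker as a stand-in for bin/worker/content-importer --verbose so it runs anywhere:

```shell
# Stand-in worker: prints to stdout and warns on stderr, like the real one
fake_worker() {
  echo "imported 3 entries"
  echo "warning: rate limited" >&2
}

# 2>&1 merges stderr into stdout; tee shows output AND saves it to a file
fake_worker 2>&1 | tee importer.log
```

Afterward importer.log holds both the normal output and the warning lines, so nothing is lost when the console scrolls away.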
braindump = off