Importer Testing

From Dreamwidth Notes

Revision as of 22:17, 15 February 2009

braindump = on

Testing the importer is straightforward. First, make sure your database is up to date, all of your code is checked out, etc. (See TheSchwartz Setup to learn how to set up the TheSchwartz database; the importer won't work without it.) Now, try this:

# schedule an import
bin/test/schedule-import -u xb95 -p SOMEPASSWORD -t xb95_on_dw -s livejournal.com

That schedules an import task for user xb95 on site livejournal.com with the given password. It will import the data to the target xb95_on_dw. Huzzah.

Now fire up the scheduler (and keep it running):

# keep scheduler running foreground, and watch the log
bin/worker/import-scheduler --foreground &
tail -f logs/import-scheduler.log

With that running in one window, you should see some noise saying it's scheduling jobs. Woot! That's good. (If instead you see an error like "worker died: can't insert job (probably duplicate job)", you probably haven't set up your TheSchwartz database.) Now you need the worker that actually gets stuff done:
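The "probably duplicate job" part of that message comes from TheSchwartz's job table, which enforces a unique (funcid, uniqkey) pair so the same job can't be queued twice. A self-contained sqlite3 stand-in sketches the mechanism (the real TheSchwartz database is MySQL, and the uniqkey value here is made up):

```shell
# Sketch of TheSchwartz's duplicate-job protection using sqlite3.
# Real schema is MySQL with more columns; this keeps only the unique key.
sqlite3 :memory: <<'SQL'
CREATE TABLE job (
  jobid   INTEGER PRIMARY KEY,
  funcid  INTEGER NOT NULL,
  uniqkey TEXT,
  UNIQUE (funcid, uniqkey)
);
INSERT INTO job (funcid, uniqkey) VALUES (1, 'import-xb95');
-- Re-scheduling the same import hits the unique constraint; TheSchwartz
-- surfaces that constraint violation as "can't insert job".
INSERT OR IGNORE INTO job (funcid, uniqkey) VALUES (1, 'import-xb95');
SELECT COUNT(*) FROM job;  -- still 1: the duplicate was rejected
SQL
```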

# start TheSchwartz worker manually
bin/worker/content-importer --verbose

You will see any noise that happens: warnings and STDERR output are dumped to the console. (Which generally means we need better debugging output for the importer; sending to STDERR in what is supposed to be a daemon is not going to be very useful...)
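Until the importer grows better debugging output, one stopgap is to redirect the worker's console noise to a log file and tail it, like the scheduler does (the log path is an assumption; adjust to your setup):

```shell
# Run the worker in the background, capturing stdout and stderr in a log
# (hypothetical log path; --verbose is the same flag shown above)
bin/worker/content-importer --verbose >> logs/content-importer.log 2>&1 &
tail -f logs/content-importer.log
```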

braindump = off