
Every scraper comes with its own SQLite database in which it can store data. You can also read data from other scrapers.

Saving data, basic

An easy save function that covers most uses:

scraperwiki::save_sqlite(array("a"),array("a"=>1, "bbb"=>"Hi there"));

If the values for the unique_keys match a record that is already there, it will be overwritten:

scraperwiki::save_sqlite(array("a"), array("a"=>1, "bbb"=>"Bye there"));

You can add new columns and the table will extend automatically. (The print_r is there so you can see the returned message.)

$message = scraperwiki::save_sqlite(array("a"), array("a"=>2, "bbb"=>"Going home", "cccnew"=>-999.9));
print_r($message);

Saving data, advanced

Each new column is given an affinity according to the type of the value it is first given (text, integer, real). It is okay to save a string in a column that was defined as an integer, but it may be converted back to that type where possible. You can define a column with no affinity by giving it a name ending in "_blob".

scraperwiki::save_sqlite(array("a"), array("a"=>1, "dddd_blob"=>"999.999"));

Further parameters of the save function are table_name (the default table name is "swdata") and verbose (which stops messages being sent to the Data tab when set to 0):

scraperwiki::save_sqlite($unique_keys, $data, $table_name = "swdata", $verbose = 2); // the full signature, with default values
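For example, a minimal sketch of saving to a table of your own and silencing the messages (the table name "prices" and its columns are just an illustration):

scraperwiki::save_sqlite(array("day"), array("day"=>"2011-06-01", "price"=>1.99), "prices", 0);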

You can also pass a list of arrays to the save for greater speed:

$data = array(array("a"=>10), array("a"=>20), array("a"=>30));
scraperwiki::save_sqlite(array("a"), $data);

Saving data, variables

It's often useful to be able to quickly and easily save a single metadata value, for example to record which page the last run of the scraper managed to get up to.

scraperwiki::save_var('last_page', 27);
print scraperwiki::get_var('last_page');

It's stored in a simple table called swvariables.
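A common pattern is to use this to resume where the previous run stopped. A minimal sketch, assuming get_var returns null for a variable that has never been saved:

$page = scraperwiki::get_var('last_page');
if (is_null($page))
    $page = 1; // first ever run
// ... scrape page $page here, then record the progress ...
scraperwiki::save_var('last_page', $page + 1);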

Finding out the schema

To see the array of table names mapping to their schemas:

print_r(scraperwiki::show_tables());

Information about a particular table (and its columns) can be queried:

$info = scraperwiki::table_info("swdata");
foreach ($info as $column)
    print $column->name . " " . $column->type . "\n";

Direct SQL for saving

You can execute SQL commands directly. Back-ticks ` are used to quote column names that have spaces in them:

scraperwiki::sqliteexecute("create table ttt (xx int, `yy` string)"); scraperwiki::sqliteexecute("insert into ttt values (?,?)", array(9, 'hello')); scraperwiki::sqliteexecute("insert or replace into ttt values (:xx, :yy)", array("xx"=>10, "yy"=>"again"));

Don't forget that after doing your inserts you need to commit the result. (The save_sqlite() command always commits automatically.)

scraperwiki::sqlitecommit();

Direct SQL for selecting

Selection is done by executing a select statement:

print_r(scraperwiki::sqliteexecute("select * from ttt")); print_r(scraperwiki::sqliteexecute("select min(xx), yy from ttt group by yy"));

The result is a structure with the list of column names under "keys" and a list of rows (each itself a list) under "data":

{ "keys"=> array("min(xx)", "yy"), data=>array(array(9, 'hello'), array(10, 'again')) }

The shorthand select command gives the results as associative arrays:

print_r(scraperwiki::select("* from ttt")); --> array(array('yy'=> 'hello', 'xx'=> 9), array('yy'=>'again', 'xx'=>10))

Direct SQL for modifying schemas

You can also clean up by deleting rows or dropping tables:

scraperwiki::sqliteexecute("delete from ttt where xx=9"); scraperwiki::sqliteexecute("drop table if exists ttt"); scraperwiki::sqlitecommit();

There's also a "clear datastore" button on the scraper page, which is useful for starting again during development if the schema is in a mess.

If you like, you can completely ignore the ScraperWiki save command, and construct all your schemas explicitly.
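A minimal sketch of doing that with the direct SQL calls from above (the "prices" table and its columns are just for illustration):

scraperwiki::sqliteexecute("create table if not exists prices (day text primary key, price real)");
scraperwiki::sqliteexecute("insert or replace into prices values (?, ?)", array("2011-06-01", 1.99));
scraperwiki::sqlitecommit();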

Reading data from other scrapers

To access data from other scrapers, we attach to them:

scraperwiki::attach("new_americ_foundation_drone_strikes"); print_r(scraperwiki::select("* from new_americ_foundation_drone_strikes.swdata limit 2"));

For convenience, you can change the name that the database is attached as:

scraperwiki::attach("new_americ_foundation_drone_strikes", "src"); print_r(scraperwiki::table_info("src.swdata"));

Access to other scrapers' data through the attach interface is read-only.
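So to modify another scraper's data, you would first copy it into your own datastore. A sketch, where the unique key "id" is an assumption about the source table:

scraperwiki::attach("new_americ_foundation_drone_strikes", "src");
$rows = scraperwiki::select("* from src.swdata");
scraperwiki::save_sqlite(array("id"), $rows, "drone_strikes_copy"); // the local copy is writable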