March 21, 2011

Use ScraperWiki to Help Turn Web Pages Into Usable Data [Programming]


A scraper is a program that extracts content from a web page or other data source and turns it into a usable format, such as an RSS feed or records entered directly into a database. Writing a scraper can be tricky because every site is structured differently. ScraperWiki aims to ease that pain by hosting a shared repository of scraper scripts, so the work of designing them can be reused and improved collaboratively.
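To make the idea concrete, here is a minimal sketch of a scraper in Python using only the standard library. The HTML snippet and the "h2 with class title" convention are made-up examples, not ScraperWiki's API; a real scraper would first fetch the page (e.g. with urllib.request.urlopen) and would then store the results in a feed or database.

```python
from html.parser import HTMLParser

class TitleScraper(HTMLParser):
    """Collects the text of <h2 class="title"> headings from HTML."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        # Start capturing when we enter a heading marked as a title.
        if tag == "h2" and ("class", "title") in attrs:
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.titles.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_title = False

# Hypothetical page markup standing in for a fetched web page.
html = """
<div>
  <h2 class="title">First story</h2><p>...</p>
  <h2 class="title">Second story</h2><p>...</p>
</div>
"""

scraper = TitleScraper()
scraper.feed(html)
print(scraper.titles)  # structured data, ready for a feed or a database
```

The fragile part is exactly what the article describes: the tag names and attributes above are specific to one site's layout, so each new site needs its own variant of this logic.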


Read more at Lifehacker