scrape data tiered
-
- Posts: 38
- Joined: Tue Dec 11, 2012 6:44 pm
scrape data tiered
I'm running into a bit of a problem programming a scrape. The scrape extracts data about reservations people have made with us. I think the problem is when people have more than one room reserved. When there is only one room, the data extracts perfectly. A single room reservation is as seen here:
Here is an example of a reservation that has multiple rooms booked:
Here is what it looks like when the data is extracted on a reservation with multiple rooms, though:
As you can see, the data ends up being tiered. Is there any way to get it to show up in one row per occurrence, or some other way?
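For context, "tiered" output like this can often be flattened in post-processing by merging each run of rows into one. A minimal Python sketch of that idea, assuming blank cells mark continuation rows and the first column is only filled on the row that starts a record (my assumptions for illustration, not necessarily how the real export looks):

```python
def collapse_tiered(rows):
    """Merge a run of tiered rows into single flat rows.

    A new output row starts whenever the first column has a value;
    rows with a blank first column fill in the remaining gaps of the
    row above them.
    """
    merged = []
    for row in rows:
        if row[0] or not merged:
            merged.append(list(row))  # start a new record
        else:
            for i, val in enumerate(row):
                if val:
                    merged[-1][i] = val  # fill gap in current record
    return merged

tiered = [
    ["Smith", "101", ""],       # guest + room on one row...
    ["",      "",    "75.00"],  # ...rate staggered onto the next
    ["Jones", "204", "90.00"],
]
print(collapse_tiered(tiered))
# [['Smith', '101', '75.00'], ['Jones', '204', '90.00']]
```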
Re: scrape data tiered
I have run into the same problem. In my case it is something about how I have defined the kinds. The scrape worked fine until I reset the kinds 'address' and 'country'.
When I ran the scrape originally, it wasn't pulling data in any fields past these fields in some records. After I noticed that, I went to those records and added the values of those fields to the kinds. Then when I ran the scrape, the data table was tiered.
When I run the scrape without these kinds, the table is filled correctly.
It may not be the same problem, but I'd love to know what I've done wrong and what I need to change. Thanks!
Re: scrape data tiered
I can't attach the scrape; it exceeds the file size limit. ???
Re: scrape data tiered
Use the "Force elements into same row" premade at File -> Online Premades. As the Heading kind, you would use the Room 1, Room 2, etc. items (this will make sense after you read the project's description). Remember that you'll need to recreate all the kinds you use inside these pages (the ones that give you broken data), except the Heading kind, after running the Do Wrap actions tree (again, this will make sense once you read the project's description), and also perform the extraction after running this tree. What the Wrap tree does (and hence what the Do Wrap tree does) is modify the HTML so that all elements under each Heading kind end up under the same HTML element, which is how Helium Scraper knows they belong in the same row.
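A rough sketch of the idea behind that wrapping step (the names and data here are my own illustration, not Helium Scraper's internals): elements that follow each "Room N" heading get grouped under one common parent, so everything in a group is treated as one row.

```python
def wrap_by_heading(elements, is_heading):
    """Group a flat element list into one sublist per heading.

    elements   -- flat sequence of scraped items, headings interleaved
    is_heading -- predicate marking the 'Room 1', 'Room 2', ... items
    Loosely mirrors what the Wrap tree does to the page's HTML:
    everything between two headings ends up under one parent.
    """
    groups = []
    for el in elements:
        if is_heading(el) or not groups:
            groups.append([el])       # a heading opens a new group
        else:
            groups[-1].append(el)     # everything else joins the current one
    return groups

page = ["Room 1", "Queen Bed", "$101.00", "Room 2", "King Bed", "$120.00"]
print(wrap_by_heading(page, lambda e: e.startswith("Room ")))
# [['Room 1', 'Queen Bed', '$101.00'], ['Room 2', 'King Bed', '$120.00']]
```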
@m1231e: I believe this should apply to your case too. If you need to send a file that's too big, try using this service.
Juan Soldi
The Helium Scraper Team
Re: scrape data tiered
Thank you so much for the response. When running the "Force elements into same row" actions tree and setting the Kind to the header (Room 1, Room 2, etc.), I get the following error message: "Message from webpage: Unknown Runtime Error."
Last edited by crookedleaf on Wed Jan 23, 2013 4:28 pm, edited 1 time in total.
Re: scrape data tiered
I've updated the premade; it shouldn't cause those errors now. Just delete the "Wrap" actions tree and import it again.
Juan Soldi
The Helium Scraper Team
Re: scrape data tiered
Thank you so much for the update. After running the action, almost the whole page goes blank. Here is a pastebin of the page's normal HTML:
http://pastebin.com/pKJgdCPP
Re: scrape data tiered
Actually, it seems to be working now! Thank you so, so much!