Hi there. I just started downloading the WHOLE of an online catalogue using:
Code:
wget -rH -Dserver.com http://www.server.com/
(just an example URL)
I now have a folder FULL of HTML pages, one per product. I need to extract the same fields from EVERY product page and write them to a CSV file.
Can anyone tell me how to do this, with examples suited to someone who has SOME Unix command-line experience (mediocre) but is no "Guru"?
Thanks
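To give an idea of what I'm after: if each page contained something like `<span class="name">...</span>` and `<span class="price">...</span>`, I imagine a loop with sed could pull those out into a CSV. The tag names and file names below are completely made up (I don't know the real page structure yet), just a self-contained demo of the idea:

```shell
# Made-up demo: create one fake product page, then extract two fields to CSV.
mkdir -p demo && cd demo
cat > product1.html <<'EOF'
<html><body>
<span class="name">Widget</span>
<span class="price">9.99</span>
</body></html>
EOF

# CSV header, then one row per HTML file in the folder.
printf 'name,price\n' > products.csv
for f in *.html; do
  # sed -n with the 'p' flag prints only lines where the substitution matched,
  # leaving just the captured text between the tags.
  name=$(sed -n 's/.*<span class="name">\([^<]*\)<\/span>.*/\1/p' "$f")
  price=$(sed -n 's/.*<span class="price">\([^<]*\)<\/span>.*/\1/p' "$f")
  printf '%s,%s\n' "$name" "$price" >> products.csv
done
```

Something along those lines is the level I can follow, if the real extraction needs a different tool (awk, perl, etc.) that's fine too.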