On Wed, Sep 26, 2012 at 4:36 PM, Shawn H Corey <shawnhcorey [ at ] gmail [ dot ] com> wrote:
> On Wed, 26 Sep 2012 15:43:26 -0400 (EDT)
> "Robert P. J. Day" <rpjday [ at ] crashcourse [ dot ] ca> wrote:
>
>> for anyone who knows perl, i'm sure this will be trivial given the
>> right perl module. if anyone can help me out, i would be grateful to
>> the extent of a beer or three, or something along those lines.
>
> Right off the bat, I think you will need WWW::Mechanize to scrape the
> website, and PDF::API2 to create the PDF.
>
> I don't think your project is going to be as simple as you think.
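Untested, but the skeleton of that two-module approach would look
something like the sketch below: one PDF page per URL read from stdin,
dumping only the page title as placeholder text. Actually laying out
the HTML content on the PDF page is the hard part neither module does
for you, which is exactly Shawn's warning.

#!/usr/bin/perl
# Untested sketch: one PDF page per URL read from stdin.  Only the
# page <title> is written, as a stand-in for real content rendering.
use strict;
use warnings;
use WWW::Mechanize;
use PDF::API2;

my $mech = WWW::Mechanize->new( autocheck => 1 );
my $pdf  = PDF::API2->new();
my $font = $pdf->corefont('Helvetica');

while ( my $url = <STDIN> ) {
    chomp $url;
    next if $url eq 'SKIP';       # same placeholder convention as below
    $mech->get($url);

    my $text = $pdf->page()->text();
    $text->font( $font, 12 );
    $text->translate( 72, 720 );  # 1" from the left edge, near the top
    $text->text( $mech->title() // $url );
}

$pdf->saveas('pages.pdf');
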
I'd cheat:
#!/bin/bash
# Read URLs from stdin, one per line; a line reading SKIP leaves a
# gap in the page numbering without producing a PDF.
page=0
while read -r URL
do
    page=$((page+1))
    if [ "$URL" != "SKIP" ]; then
        html2ps -o - "$URL" | ps2pdf - "$(printf 'page-%03d.pdf' "$page")"
    fi
done
(warning: typed into an email client, not tested as a shell script, so
gotchas may remain)
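To run either version, feed it the URL list on stdin, one per line,
with the literal word SKIP on a line wherever you want a gap in the
numbering, e.g. ./make-pdfs.sh < urls.txt (the script name is whatever
you save it as).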
--
Aidan Van Dyk Create like a god,
aidan [ at ] highrise [ dot ] ca command like a king,
http://www.highrise.ca/ work like a slave.