
Re: [OCLUG-Tech] looking for perl(?) script to convert list of URLs to PDF files

  • Subject: Re: [OCLUG-Tech] looking for perl(?) script to convert list of URLs to PDF files
  • From: Aidan Van Dyk <aidan [ at ] highrise [ dot ] ca>
  • Date: Wed, 26 Sep 2012 16:59:18 -0400
On Wed, Sep 26, 2012 at 4:53 PM, Robert P. J. Day <rpjday [ at ] crashcourse [ dot ] ca> wrote:

>> I'd cheat:
>>
>>    #!/bin/bash
>>    page=0
>>    while read -r URL
>>    do
>>       page=$((page+1))
>>       if [ "$page" != "SKIP" ]; then
>>          html2ps -o - "$URL" | ps2pdf - "$(printf 'page-%03d.pdf' "$page")"
>>       fi
>>    done
>
>   hmmmmm ... that has potential but a simple test of html2ps shows
> that it doesn't accurately portray, say, oclug.on.ca.

html2ps is basic.  Its options help, but it doesn't do CSS (at least,
not when I last used it).  I use it on a few pages (CGI outputs) regularly,
but I'll admit to customizing my HTML to make it work well with
html2ps.

But, with that scheme, s/html2ps/$ANY_CONVERTER_OF_YOUR_CHOICE/

Like:
    http://code.google.com/p/wkhtmltopdf/
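
For instance, a rough (untested) sketch of the same loop with wkhtmltopdf
swapped in -- "urls.txt" here is just a stand-in for wherever your list of
URLs lives:

    #!/bin/bash
    # Same idea as above, but wkhtmltopdf renders each URL (CSS and all)
    # directly to a PDF, so there's no ps2pdf step.
    page=0
    while read -r URL
    do
       page=$((page+1))
       wkhtmltopdf "$URL" "$(printf 'page-%03d.pdf' "$page")"
    done < urls.txt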

a.


-- 
Aidan Van Dyk                                             Create like a god,
aidan [ at ] highrise [ dot ] ca                                       command like a king,
http://www.highrise.ca/                                   work like a slave.