
Re: [OCLUG-Tech] looking for perl(?) script to convert list of URLs to PDF files

  • Subject: Re: [OCLUG-Tech] looking for perl(?) script to convert list of URLs to PDF files
  • From: "Robert P. J. Day" <rpjday [ at ] crashcourse [ dot ] ca>
  • Date: Wed, 26 Sep 2012 18:02:53 -0400 (EDT)
On Wed, 26 Sep 2012, Aidan Van Dyk wrote:

> On Wed, Sep 26, 2012 at 4:53 PM, Robert P. J. Day <rpjday [ at ] crashcourse [ dot ] ca> wrote:
>
> >> I'd cheat:
> >>
> >>    #!/bin/bash
> >>    page=0
> >>    while read URL
> >>    do
> >>       page=$((page+1))
> >>       if [ "$page" != "SKIP" ]; then
> >>          html2ps -o - "$URL" | ps2pdf - "$(printf 'page-%03d.pdf' $page)"
> >>       fi
> >>    done
> >
> >   hmmmmm ... that has potential, but a simple test of html2ps shows
> > that it doesn't accurately render, say, oclug.on.ca.
>
> html2ps is basic.  Its options help, but it doesn't do CSS (at least,
> not when I used it).  I use it on a few pages (CGI outputs) regularly,
> but I'll admit to customizing my HTML to make it work well with
> html2ps.
>
> But, with that scheme, s/html2ps/$ANY_CONVERTER_OF_YOUR_CHOICE/
>
> Like:
>     http://code.google.com/p/wkhtmltopdf/

  just FYI, this was the solution, for which i am immensely grateful.
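
  for anyone hitting this thread in the archive, here's roughly what the
loop ends up looking like with wkhtmltopdf dropped in place of html2ps
(the urls.txt file name and the SKIP variable are just placeholders,
adjust to taste):

    #!/bin/bash
    # read URLs, one per line, from the file named on the command line
    # and convert each one to a numbered PDF with wkhtmltopdf; set SKIP
    # to a page number you want left out
    page=0
    while read -r URL
    do
        page=$((page+1))
        if [ "$page" != "${SKIP:-none}" ]; then
            wkhtmltopdf "$URL" "$(printf 'page-%03d.pdf' "$page")"
        fi
    done < "$1"

  invoked as something like "./urls2pdf.sh urls.txt".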

rday

-- 

========================================================================
Robert P. J. Day                                 Ottawa, Ontario, CANADA
                        http://crashcourse.ca

Twitter:                                       http://twitter.com/rpjday
LinkedIn:                               http://ca.linkedin.com/in/rpjday
========================================================================