
Re: [OCLUG-Tech] looking for perl(?) script to convert list of URLs to PDF files

  • Subject: Re: [OCLUG-Tech] looking for perl(?) script to convert list of URLs to PDF files
  • From: "Robert P. J. Day" <rpjday [ at ] crashcourse [ dot ] ca>
  • Date: Wed, 26 Sep 2012 17:03:34 -0400 (EDT)
On Wed, 26 Sep 2012, yanick [ at ] babyl [ dot ] dyndns [ dot ] org wrote:

> On Wed, Sep 26, 2012 at 06:16:31PM -0400, Champoux wrote:
> > On Wed, Sep 26, 2012 at 03:43:26PM -0400, Robert P. J. Day wrote:
> > >   i have a text file containing over 100 URLs (one per line), and i
> > > want to produce that many PDF files of those URL pages, with output
> > > filenames simply 1.pdf, 2.pdf and so on -- that numbering is
> > > required.
> >
> > 	$ perl -MLWP::Simple=mirror -ne'chomp; next if /^SKIP/;
> > 	    mirror( $_ => $. . ".pdf" ); print "saving $..pdf\n"' < list
> >
> > 	?
>
> 	Wait, I might have misunderstood. Are the original links
> not pdfs, but you want to convert them on the fly?

  exactly.  all the URLs are plain htm/html/shtml pages, and i want
to (rough sketch after the list):

  1) grab them
  2) convert to PDF
  3) save to sequentially numbered files
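
  something like the following would cover all three steps -- a
rough, untested sketch that assumes the wkhtmltopdf command-line tool
is installed and on $PATH, and that the URLs live in a file named
"list" (the script name is made up):

	#!/usr/bin/perl
	# urls2pdf.pl (hypothetical name): read URLs one per line and
	# render each page to N.pdf, where N is the input line number.
	use strict;
	use warnings;

	while ( my $url = <> ) {
	    chomp $url;
	    next unless $url =~ /\S/;    # skip blank lines
	    my $out = $. . ".pdf";       # $. is the current line number
	    print "saving $out\n";
	    # wkhtmltopdf fetches the page and renders it to PDF
	    system( 'wkhtmltopdf', $url, $out ) == 0
	        or warn "wkhtmltopdf failed for $url\n";
	}

  run as:

	$ ./urls2pdf.pl < list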

i'm effectively trying to script what you can do manually in the
chromium browser with "Print" and the save-as-pdf destination.  i
just don't want to do that by hand over 100 times.
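
  (and assuming a chromium binary that supports --headless and
--print-to-pdf -- something i haven't verified -- the same one-liner
pattern could drive the browser itself:

	$ perl -lne 'system "chromium", "--headless",
	    "--print-to-pdf=$..pdf", $_' < list

  as before, $. numbers the output files 1.pdf, 2.pdf and so on.)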

> 	If so, yeah, it's going to be a mite more complicated. I did
> something similar in the past, if it can help:
> http://babyl.dyndns.org/techblog/entry/contact-sheet

  ok, i'll check it out, thanks.

rday

-- 

========================================================================
Robert P. J. Day                                 Ottawa, Ontario, CANADA
                        http://crashcourse.ca

Twitter:                                       http://twitter.com/rpjday
LinkedIn:                               http://ca.linkedin.com/in/rpjday
========================================================================