question about automating download/install of google's "repo" tool

  • Subject: question about automating download/install of google's "repo" tool
  • From: "Robert P. J. Day" <rpjday [ at ] crashcourse [ dot ] ca>
  • Date: Mon, 30 Mar 2020 13:59:44 -0400 (EDT)
  currently having a discussion with a couple of colleagues about
automating the use of google's repo tool, part of which involves a
proposal to download the repo launcher tool itself and store it
(possibly version controlled) locally.

  as people familiar with repo will know, repo effectively comes as
two distinct tools (even if bundled in the same script):

  1) the "launcher tool", which is run when you initialize a new
     repo checkout with "repo init", and

  2) the "main" repo tool, which is installed in every initialized
     repo, and which is the one invoked to do the actual repo
     operations on that existing repo

as explained here:

  https://gerrit.googlesource.com/git-repo

to prepare to use repo, the easiest strategy is to copy the launcher
tool to, say, your personal bin directory (and make sure it's in your
search path, of course):

$ mkdir -p ~/.bin
$ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
$ chmod a+rx ~/.bin/repo

  IOW, every time you run "repo", it will run your personal launcher
tool, which will either:

  1) create a new repo, and install the main repo command in that
repo, or

  2) recognize that you already have a repo, and invoke the main
"repo" command that is already there
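
  as a rough sketch of that dispatch behaviour (heavily simplified --
the real launcher is a python script, and the function name here is
just mine for illustration -- but the idea is to walk upward looking
for an existing checkout):

```shell
# sketch only: approximate the launcher's "is there already a
# checkout here?" decision by walking up the directory tree.
run_repo_would() {
    dir="$1"
    while [ "$dir" != "/" ]; do
        if [ -d "$dir/.repo/repo" ]; then
            # an existing checkout: the launcher hands off to the
            # main repo tool installed under .repo/repo
            echo "dispatch: $dir/.repo/repo"
            return 0
        fi
        dir=$(dirname "$dir")
    done
    # no checkout found: "repo init" would bootstrap a new one
    echo "bootstrap: no checkout at or above $1"
}
```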

  now, in aid of automating as much of this as possible, one of the
proposals is to download the launcher tool ahead of time, store it
locally (possibly version controlled), and have the automation script,
for new repos, check out that stored copy and take it from there.
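
  in script form, that stored-copy approach would amount to something
like this (the paths and the function name are purely illustrative,
not anything repo itself provides):

```shell
# sketch of the stored-copy proposal: install the launcher from a
# vendored (version-controlled) copy instead of downloading it.
install_vendored_launcher() {
    vendored="$1"   # e.g. a checked-out tools/repo in the automation repository
    dest="$2"       # e.g. ~/.bin/repo
    [ -f "$vendored" ] || { echo "no vendored launcher at $vendored" >&2; return 1; }
    mkdir -p "$(dirname "$dest")"
    cp "$vendored" "$dest"
    chmod a+rx "$dest"
}
```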

  personally, i prefer the simpler approach of just running these
commands every single time at the top of the automation script to make
sure every developer has their own copy of the launcher tool:

$ mkdir -p ~/.bin
$ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
$ chmod a+rx ~/.bin/repo

my attitude is, so what if it downloads a 1000-line script each time?
doing it this way guarantees that one will always have the latest
version from google. i think trying to avoid this by cleverly storing
a local copy is way overkill, and introduces the possibility of a
local copy getting out of date.
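
  in the script itself, my approach is just those three commands
wrapped in a function, maybe with a bit of error checking added
("-f" makes curl fail loudly on HTTP errors instead of silently
leaving a broken ~/.bin/repo behind -- the function name is mine):

```shell
# sketch: fetch a fresh launcher at the top of the automation script.
fetch_repo_launcher() {
    url="$1"    # normally the google storage URL quoted above
    dest="$2"   # install location, e.g. ~/.bin/repo
    mkdir -p "$(dirname "$dest")"
    curl -fsSL "$url" > "$dest" || return 1
    chmod a+rx "$dest"
}
```

which the script would then call as, e.g.:

$ fetch_repo_launcher https://storage.googleapis.com/git-repo-downloads/repo ~/.bin/repo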

  anyway, thoughts?

rday

To unsubscribe send a blank message to linux+unsubscribe [ at ] linux-ottawa [ dot ] org
To get help send a blank message to linux+help [ at ] linux-ottawa [ dot ] org
To visit the archives: https://lists.linux-ottawa.org