Discussion:
Offline systems
Peter
2020-09-20 14:50:02 UTC
Hi,

I want to set up a few servers (30+) on an intranet network, and NONE
of them (not even a master server or anything like that) will have
internet access.

What is the best approach so I can still install packages on demand?

Should I make a custom repo? In that case, what would be the proper
procedure to update the repo?

What is your advice ?

Thank you,

Peter
Dan Ritter
2020-09-20 15:10:01 UTC
Post by Peter
Hi,
I want to set up a few servers (30+) on an intranet network, and NONE
of them (not even a master server or anything like that) will have
internet access.
What is the best approach so I can still install packages on demand?
Should I make a custom repo? In that case, what would be the proper
procedure to update the repo?
What is your advice ?
I'm assuming you have internet access somewhere else, and can
regularly download packages to removable media, then bring it
back to your isolated network.

You can use apt-mirror to bring over new packages to the
removable media, then move it over to a server on your network
which can be set up as the apt server for the other machines.

aptly may be useful in accepting the new packages and publishing
them locally.
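
Off the top of my head, and untested as typed, the aptly side would be
roughly this (repo name and paths are just examples):

  # on the isolated apt server, import the .debs carried over on the media
  aptly repo create -distribution=buster -component=main local-buster
  aptly repo add local-buster /media/usb/debs/
  # publish it; either sign with a GPG key or skip signing and use
  # [trusted=yes] on the clients
  aptly publish repo -skip-signing local-buster
  aptly serve   # quick HTTP server, or export ~/.aptly/public yourself

Later rounds are just "aptly repo add" again followed by
"aptly publish update buster" (with the same signing choice).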

-dsr-
Hans
2020-09-20 15:10:01 UTC
On Sunday, 20 September 2020, 16:46:04 CEST, Peter wrote:
Hi,

I would install with the full-install DVD, not netinst. Then you could
set up your own repo (by mirroring the one you need).

Finally, point the servers to your own repo server.
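
On the clients that is then a single line in /etc/apt/sources.list,
something like this (the hostname is only an example; [trusted=yes] is
only needed if the mirror is unsigned):

  deb [trusted=yes] http://repo.example.lan/debian buster main contrib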

Besides, I personally would never trust those servers when someone
else is administering them, especially within a company. Do you really
know that those packages are not compromised to spy on the employees???

Best

Hans
Post by Peter
Hi,
I want to set up a few servers (30+) on an intranet network, and NONE
of them (not even a master server or anything like that) will have
internet access.
What is the best approach so I can still install packages on demand?
Should I make a custom repo? In that case, what would be the proper
procedure to update the repo?
What is your advice ?
Thank you,
Peter
Peter
2020-09-20 15:50:02 UTC
Hi,

I will be administering all the servers. I can trust them; I just cannot
have internet access on that network. The number of servers can go up
to 70. Also, the servers are in different locations with relatively
slow (1-5 Mb/s) connections.

Do you see any drawbacks if I copy the 3 DVDs of Debian 10 onto each
machine initially, mount them and use them if I want to install
something, e.g. nano, at some point?

Thanks,

Peter
Post by Hans
Hi,
I would install with the full-install DVD, not netinst. Then you could
set up your own repo (by mirroring the one you need).
Finally, point the servers to your own repo server.
Besides, I personally would never trust those servers when someone
else is administering them, especially within a company. Do you really
know that those packages are not compromised to spy on the employees???
Best
Hans
Post by Peter
Hi,
I want to set up a few servers (30+) on an intranet network, and NONE
of them (not even a master server or anything like that) will have
internet access.
What is the best approach so I can still install packages on demand?
Should I make a custom repo? In that case, what would be the proper
procedure to update the repo?
What is your advice ?
Thank you,
Peter
deloptes
2020-09-20 22:50:02 UTC
Post by Peter
Do you see any drawbacks if I copy the 3 DVDs of Debian 10 onto each
machine initially, mount them and use them if I want to install
something, e.g. nano, at some point?
I don't see any issue with that. I copy the ISO images, mount them and add
them as a source. I think 1-5 Mbps is still good enough to use one single
server as the source, but if you have the disk space, you could do it on
each of them.
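
For the record, the mount-and-add part is roughly this (the ISO name and
paths are only examples, untested as typed):

  mkdir -p /media/dvd1
  mount -o loop,ro /srv/iso/debian-10-amd64-DVD-1.iso /media/dvd1
  # either let apt-cdrom register it:
  apt-cdrom -m -d=/media/dvd1 add
  # or add it by hand to /etc/apt/sources.list:
  #   deb [trusted=yes] file:/media/dvd1 buster main contrib

Same idea for DVD-2 and DVD-3, or share one mounted copy over the LAN so
you don't need the ISOs on every box.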
mick crane
2020-09-20 15:20:01 UTC
Post by Peter
Hi,
I want to set up a few servers (30+) on an intranet network, and NONE
of them (not even a master server or anything like that) will have
internet access.
What is the best approach so I can still install packages on demand?
Should I make a custom repo? In that case, what would be the proper
procedure to update the repo?
What is your advice ?
You'll want "something" connected to the internet to get updates.
Somebody mentioned jigdo, which looked like a good thing.
It depends how paranoid you are. You could take a disk or whatever from
that separate, internet-connected machine and use that as your
repository, I imagine.

mick
--
Key ID 4BFEBB31
Thomas Schmitt
2020-09-20 15:40:02 UTC
Hi,
Post by mick crane
Somebody mentioned jigdo which looked like a good thing.
Jigdo is used for making ISO images from a frame of ISO 9660 metadata and
other non-packaged stuff (the .template file), and the .deb packages on a
mirror, or in a repository, or in an older ISO image.
The .jigdo file contains the list of packages which shall be inserted into
the frame.

So except that you get them wrapped in an ISO image, the packages are just
the same as on the mirror server (or other package source) from which you
could fetch them by appropriate means.
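
If you want to try it: jigdo-lite from the package "jigdo-file" does the
whole job, e.g. (the .jigdo file name below is a placeholder, pick the
real one from the directory listing on cdimage.debian.org):

  apt-get install jigdo-file
  jigdo-lite https://cdimage.debian.org/debian-cd/current/amd64/jigdo-dvd/SOME-IMAGE.jigdo

At the "Files to scan" prompt you can point it at an older ISO or a local
package tree, so only the missing packages get downloaded.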


Have a nice day :)

Thomas
mick crane
2020-09-20 15:50:02 UTC
Post by Thomas Schmitt
Hi,
Post by mick crane
Somebody mentioned jigdo which looked like a good thing.
Jigdo is used for making ISO images from a frame of ISO 9660 metadata and
other non-packaged stuff (the .template file), and the .deb packages on a
mirror, or in a repository, or in an older ISO image.
The .jigdo file contains the list of packages which shall be inserted into
the frame.
So except that you get them wrapped in an ISO image, the packages are just
the same as on the mirror server (or other package source) from which you
could fetch them by appropriate means.
Have a nice day :)
Thomas
What's the best appropriate means then to fetch a mirror and then only
fetch the differences to a local copy? wget does that, doesn't it?
Not that I want to, but I wondered.

mick
--
Key ID 4BFEBB31
Thomas Schmitt
2020-09-20 16:10:01 UTC
Hi,
What's the best appropriate means then to fetch a mirror and then only fetch
the differences to a local copy? wget does that, doesn't it?
That's indeed an old selling point of Jigdo: No need to download packages
which you already have.

But you need a pair of .jigdo and .template files which Debian produces for
its ISOs in order to get a new download list. And in the end the new ISO
will be as big as if all packages had been downloaded.

Whatever, Jigdo might have a role in this thread.
Do you see any drawbacks if I copy the 3 DVDs of Debian 10 onto each machine
initially, mount them and use them if I want to install something, e.g.
nano, at some point?
Should work. But you might get better performance if you unpack the ISOs'
"pool" trees into a common "pool" tree on hard disk. One level of mount
less (rough sketch further down).

Jigdo could bring you bigger single ISOs for either of the approaches.
16 GB: https://cdimage.debian.org/debian-cd/current/amd64/jigdo-16G/
25 GB: https://cdimage.debian.org/debian-cd/current/amd64/jigdo-bd/
50 GB: https://cdimage.debian.org/debian-cd/current/amd64/jigdo-dlbd/

(I wonder what software ends up in DLBD-2 or BD-4 ...)
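
The pool merging mentioned above could look roughly like this, assuming the
ISOs are mounted at /media/dvd1..3 and the target is /srv/debian (untested
as typed):

  mkdir -p /srv/debian
  for d in /media/dvd1 /media/dvd2 /media/dvd3 ; do
    rsync -a "$d"/pool/ /srv/debian/pool/
  done
  cd /srv/debian
  apt-ftparchive packages pool > Packages
  gzip -k Packages
  apt-ftparchive release . > Release

and in sources.list on the machines:

  deb [trusted=yes] file:/srv/debian ./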


Have a nice day :)

Thomas
Andrew Cater
2020-09-20 16:30:01 UTC
Myself - I might use the 16G stick as the install medium - see conversations
elsewhere on this list on how to use jigdo. That gives you the contents of
more than DVD 1 to DVD 3 in one small format. These are up to 70 real,
physical servers and not VMs? You might want to look at automated ways to
deploy and build that many servers.

What you're asking is possible - but not necessarily a good move: Debian
changes daily with fixes, security fixes and so on. Unless you are going to
build them and then walk away completely - build a mirror which is
dual-homed: you can connect it to the internet to pull in updates very
regularly, then disconnect from the internet and use this machine to update
all of the others.
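
With apt-mirror on the dual-homed box that is little more than an
/etc/apt/mirror.list along these lines (sketch only - adjust suites and
the base path), then run "apt-mirror" whenever it is connected:

  set base_path /srv/apt-mirror
  deb http://deb.debian.org/debian buster main contrib non-free
  deb http://deb.debian.org/debian buster-updates main contrib non-free
  deb http://security.debian.org/debian-security buster/updates main contrib non-free
  clean http://deb.debian.org/debian

The mirror ends up under /srv/apt-mirror/mirror/deb.debian.org/debian,
which any web server can export to the internal machines.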

All the very best, as ever,

Andy C
Post by mick crane
Post by Thomas Schmitt
Hi,
Post by mick crane
Somebody mentioned jigdo which looked like a good thing.
Jigdo is used for making ISO images from a frame of ISO 9660 metadata and
other non-packaged stuff (the .template file), and the .deb packages on a
mirror, or in a repository, or in an older ISO image.
The .jigdo file contains the list of packages which shall be inserted into
the frame.
So except that you get them wrapped in an ISO image, the packages are just
the same as on the mirror server (or other package source) from which you
could fetch them by appropriate means.
Have a nice day :)
Thomas
What's the best appropriate means then to fetch a mirror and then only
fetch the differences to a local copy? wget does that, doesn't it?
Not that I want to, but I wondered.
mick
--
Key ID 4BFEBB31
Peter
2020-09-21 05:10:03 UTC
Hi,
Post by Andrew Cater
These are up to 70
real, physical servers and not VMs? You might want to look at automated
ways to deploy and build that many servers.
All 70 will be physical. I am thinking about setting up a single server,
then using HDClone or Acronis to copy it to the rest. Then I'd write a
small bash script for the onsite tech to change the IP. Better ideas? :-)
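
The "small bash script" would be nothing fancy, something like this
(a sketch only; it assumes the classic /etc/network/interfaces setup):

  #!/bin/bash
  # usage: set-ip.sh NEW_IP NEW_HOSTNAME   (run as root, then reboot)
  set -e
  NEW_IP="$1"; NEW_HOST="$2"
  # keep the existing netmask/gateway lines, only swap the address
  sed -i "s|^\( *address \).*|\1${NEW_IP}|" /etc/network/interfaces
  OLD_HOST=$(cat /etc/hostname)
  echo "${NEW_HOST}" > /etc/hostname
  sed -i "s|${OLD_HOST}|${NEW_HOST}|g" /etc/hosts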

Peter
Post by Andrew Cater
Myself - I might use the 16G stick as the install medium - see conversations
elsewhere on this list on how to use jigdo. That gives you the contents
of more than DVD 1 to DVD 3 in one small format. These are up to 70
real, physical servers and not VMs? You might want to look at automated
ways to deploy and build that many servers.
What you're asking is possible - but not necessarily a good move: Debian
changes daily with fixes, security fixes and so on. Unless you are going
to build them and then walk away completely - build a mirror which is
dual-homed: you can connect it to the internet to pull in updates very
regularly, then disconnect from the internet and use this machine to
update all of the others.
All the very best, as ever,
Andy C
Post by Thomas Schmitt
Hi,
Post by mick crane
Somebody mentioned jigdo which looked like a good thing.
Jigdo is used for making ISO images from a frame of ISO 9660 metadata and
other non-packaged stuff (the .template file), and the .deb packages on a
mirror, or in a repository, or in an older ISO image.
The .jigdo file contains the list of packages which shall be inserted into
the frame.
So except that you get them wrapped in an ISO image, the packages are just
the same as on the mirror server (or other package source) from which you
could fetch them by appropriate means.
Have a nice day :)
Thomas
What's the best appropriate means then to fetch a mirror and then only
fetch the differences to a local copy? wget does that, doesn't it?
Not that I want to, but I wondered.
mick
--
Key ID 4BFEBB31
David Christensen
2020-09-21 05:20:02 UTC
Post by Peter
Hi,
Post by Andrew Cater
These are up to 70
real, physical servers and not VMs? You might want to look at automated
ways to deploy and build that many servers.
All 70 will be physical. I am thinking about setting up a single server,
then using HDClone or Acronis to copy it to the rest. Then I'd write a
small bash script for the onsite tech to change the IP. Better ideas? :-)
https://www.clonezilla.org/

(They currently seem to be having problems with their https certificate.)


David
Andrei POPESCU
2020-09-21 06:50:01 UTC
Post by Peter
Hi,
Post by Andrew Cater
These are up to 70
real, physical servers and not VMs? You might want to look at automated
ways to deploy and build that many servers.
All 70 will be physical. I am thinking about setting up a single server,
then using HDClone or Acronis to copy it to the rest. Then I'd write a
small bash script for the onsite tech to change the IP. Better ideas? :-)
There are also ssh keys, partition UUIDs and other things that should be
changed and/or regenerated.
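
For the ssh host keys and the machine id that is a one-off on every clone,
roughly:

  rm -f /etc/ssh/ssh_host_*
  dpkg-reconfigure openssh-server     # recreates the missing host keys
  rm -f /etc/machine-id
  systemd-machine-id-setup            # fresh /etc/machine-id

Filesystem UUIDs are more work (tune2fs -U random plus fixing fstab and
grub), which is one more argument for installing rather than cloning.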

FAI[1] was explicitly designed for your use case.

To manage the servers after installation you might want something like
puppet, ansible, chef, salt, etc.

[1] https://fai-project.org

Kind regards,
Andrei
--
http://wiki.debian.org/FAQsFromDebianUser
deloptes
2020-09-21 10:10:06 UTC
Post by Peter
All 70 will be physical. I am thinking about setting up a single server,
then using HDClone or Acronis to copy it to the rest. Then I'd write a
small bash script for the onsite tech to change the IP. Better ideas? :-)
ansible:
you invest time to configure it (and perhaps to learn it as well),
you save time on maintenance.

Or ansible alternatives like puppet or openstack.
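
Even without playbooks, an inventory plus ad-hoc commands already covers
the "install a package on demand" part, roughly (names are examples, and
the local repo must of course already be in the targets' sources.list):

  # /etc/ansible/hosts
  [intranet]
  srv[01:70].example.lan

  # install nano everywhere
  ansible intranet -m apt -a "name=nano state=present" --become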
