Sunday, July 2, 2017

Windows Subsystem for Linux / Bash on Ubuntu on Windows

I'm reinstalling Bash on Ubuntu on Windows on my work laptop at home, where I'm not behind the work firewall and don't need the work proxy server.
WSL happily activates and downloads Ubuntu from the Windows Store while at work, but once it fires up Ubuntu and starts running Apt to install updates, it chokes, because neither Ubuntu nor Apt is configured to use the proxy server. That means I have to cancel the install, which leaves a working system, but one that never completed its setup. I can fire up Bash, but it always logs in as root (it never gets to the user-setup step). Once logged in I can configure Apt to use the proxy, set the proxy environment variables, and run the Apt updates, but the system still hasn't gone through the full install process cleanly. This is a weakness in Microsoft's / Canonical's design: Ubuntu should either inherit the proxy configuration from Windows, or offer a way to configure it during setup, so it can perform a clean install.
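For reference, this is roughly how I patch things up by hand from that root shell; the proxy host and port below are placeholders, not our actual proxy:

    # Assumed proxy: proxy.example.com:8080 -- substitute your own.
    # Point Apt at the proxy:
    cat > /etc/apt/apt.conf.d/95proxy <<'EOF'
    Acquire::http::Proxy "http://proxy.example.com:8080/";
    Acquire::https::Proxy "http://proxy.example.com:8080/";
    EOF

    # And the shell environment, for everything that isn't Apt:
    export http_proxy=http://proxy.example.com:8080
    export https_proxy=http://proxy.example.com:8080

    apt-get update && apt-get upgrade
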
I figured I'd give it a try from home, where it doesn't need to go through a proxy, and see if it would properly complete the install. It worked perfectly on my personal laptop.

Edited: We have success! The prompt for a username means we got past the blocker seen at work.

[Screenshot: Bash on Ubuntu on Windows install username prompt]

Sunday, June 11, 2017

Bad SSL security

I see GNS3 Academy still hasn't fixed their SSL certificate.

For a site teaching about networking, which includes network security, this is head-shakingly bad.

Saturday, June 10, 2017

Minor Home Network Rewiring

After some minor home network rewiring, I'm rather pleased with my Internet performance today. The work: ran 2 additional Cat6 cables from the network rack to the desk, re-tipped all the Ethernet cables with keystone jacks, installed the jacks in a 4-gang surface-mount box, and ran patch cables from the computers to the new jacks. Unfortunately, one of the original cables running between the rack and my desk is only 20', which is too short for the new path, so for now there are only 4 live connections at the desk. I'll replace that cable later.

Speedtest: 23 ms ping, 49.45 Mbps download, 5.44 Mbps upload (AT&T Internet, Keller, TX, server < 50 mi)

To Do: Install patch panel in the network rack, re-tip these cables into the back of the patch panel, install patch cables from panel to switch. Re-tip the cables going from the network rack to the Cisco lab bench the same way. Install some split loom or spiral conduit around these cable runs to keep them dressed neatly.

Keystone Jacks
4-gang surface mount box
25' Cat6 Ethernet cables
5' split loom (I should have ordered longer)

Thursday, June 8, 2017

A Day in the Life of a Unix Geek

Wow, been almost a year since I blogged anything. I'm getting lazy.

So what's the daily life of a systems administrator like? Here's how today went:

The plan coming in this morning: begin the quarterly "Vulnerability audit report".

What did I do?
A Windows server starts alerting on CPU at midnight, again. We fixed that problem on Tuesday. Why is it alerting again?
Of course, it corrects itself before I can get logged in and doesn't go off again all day. Send an email to the person responsible for the application on that server asking if the app was running any unusually CPU-intensive jobs, and follow up with a screenshot showing the times the CPU alerts fired. Get a response of "nothing unusual". As usual.

We updated the root password on all Unix servers last week. Get a list from a coworker of 44 systems that still have the old root password.
Check the list, confirm all still have the old root password.
Check the list against the systems that were updated via Ansible. All 44 are on the Ansible list. There were no failures when running the Ansible playbook to update the root password, and every spot-check at the time showed the new password in effect and the task working as expected.
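(For the curious: the change itself is just Ansible's user module. This is a minimal sketch under generic names, not our actual playbook; the variable and hash here are hypothetical.)

    # Sketch: set root's password to a pre-generated SHA-512 hash.
    # Generate a hash with, e.g.:
    #   python -c 'import crypt; print(crypt.crypt("NewPass", crypt.mksalt(crypt.METHOD_SHA512)))'
    - hosts: all
      become: yes
      tasks:
        - name: Update root password
          user:
            name: root
            password: "{{ new_root_hash }}"
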
Begin investigating why these systems still have the old root password.
Speculation during team scrum that Puppet might be resetting the root password.
Begin testing a hypothesis that root password was, in fact, changed, but something else is re-setting it back to the old password.
Manually update the root password on one host. Monitor /etc/shadow to see if it changes again after setting the password (watch -d ls -l /etc/shadow).
Wait.
Wait.
Wait some more.
Wait 27 minutes, BOOM! /etc/shadow gets touched.
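(In hindsight, a cleaner way to catch the toucher red-handed would have been an audit watch, assuming auditd is installed and running. Not what I did, but worth noting:)

    # Watch for writes/attribute changes on /etc/shadow:
    auditctl -w /etc/shadow -p wa -k shadow_watch
    # After the next touch, ask who did it:
    ausearch -k shadow_watch -i
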
Investigate to see if Puppet is the culprit. I know nothing about Puppet; I'm an Ansible guy. The Puppet guy (who knows just enough to have set up the server, built some manifests, and gotten Puppet to update root the last time the root password was changed, before I started working here) is out today.
Look at log files in /var/log. Look at files in /etc/puppet on the Puppet server. Try to find anything that mentions "passw(or)?d && root" (did I mention I'm not a Puppet guy?). Find a manifest that says something about setting the root password, but it references a variable, and I can't find where the value of that variable is set.
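(That flailing looked roughly like this; the paths assume a stock /etc/puppet layout, so adjust to taste:)

    # Files mentioning a password, that also mention root:
    grep -rEil 'passw(or)?d' /etc/puppet/manifests /etc/puppet/modules \
        | xargs -r grep -il root
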
Look some more at the target host. See in the log files that it's failing to talk to the Puppet server, so it keeps enforcing the last set of configuration it received. Great, fixing this on the Puppet server won't necessarily fix all the clients that quietly lost connectivity without anyone noticing (entropy can be a bitch).
Begin looking at what to change on the client (other than just "shut down the Puppet service" and "kill it with fire!"). Realize it's much faster to surf all the files and directories involved with "mc".
Midnight Commander not installed. Simple enough, "yum install mc".
Yum: "What, you want to install something in the base RHEL repo? HAH! Entropy, baby! I have no idea what's in the base repo.".
Me: <cracking knuckles> "Hold my beer." (This is Texas, Y'all.)
(No, not really. CTO frowns on drinking during work hours, or drinking while logged into production systems. Or just drinking while logged in...)
OK, so more like:
Me: <cracking knuckles> "Hold my Diet Coke."
Yum: "Red Hat repos? We don't need no steeeenking Red Hat repos!"
Me: <rolls up sleeves, fixes the yum config that somehow had the rhnplugin turned off>
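(The actual fix is anticlimactic. On RHEL of this vintage the RHN yum plugin lives in /etc/yum/pluginconf.d/rhnplugin.conf, and it had been set to enabled = 0:)

    # Re-enable the RHN plugin and rebuild the cache:
    sed -i 's/^enabled *= *0/enabled = 1/' /etc/yum/pluginconf.d/rhnplugin.conf
    yum clean all && yum makecache
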

Start updating the Yum repo cache. Run out of space in /var. Discover that when this server was built, /var was made much too small. Start looking at what to clean up.
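(First step of any /var cleanup, for me: find the space hogs.)

    # Sizes in MB, staying on /var's filesystem, biggest last:
    du -xm /var | sort -n | tail -20
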
Fix logrotate to compress log files when it rotates them, manually compress old log files.
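(The logrotate side of that is two directives in /etc/logrotate.conf, plus a one-liner to squash the backlog. The find pattern assumes numbered rotations; adjust it if your logrotate uses dateext.)

    # In /etc/logrotate.conf:
    #   compress        # gzip rotated logs
    #   delaycompress   # leave the newest rotation uncompressed for tail -f
    # Compress the existing backlog by hand:
    find /var/log -type f -name '*.[0-9]' -exec gzip {} \;
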
/var/lib/clamav is one of the larger directories. Oh, look, several failed DB updates that never got cleaned up.
Clean up the directory, run freshclam. Gee, ClamAV DB downloads sure are taking a long time given that it's got a GigE connection to the local DatabaseMirror. Check the freshclam config. Yup, the local mirror is configured... external mirror ALSO configured. Dang it. Fix that. ClamAV DB updates now much faster.
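(The relevant bit of /etc/freshclam.conf, after the fix; the local mirror name here is a placeholder:)

    # Keep only the local mirror:
    DatabaseMirror clamav-mirror.example.local
    # DatabaseMirror database.clamav.net   <- the stray external entry, now disabled
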
Run yum repo cache update again. Run out of disk space again. Wait... why didn't Nagios alert that /var was full?
Oh, look, when /var was made a separate partition, no one updated Nagios to monitor it.
Log into Nagios server to update config file for this host. Check changes into Git. Discover there have been a number of other Nagios changes lately that haven't been checked into Git. Spend half an hour running git status / diff / add / delete / commit / push to get all changes checked into Git repo.
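(The missing check was something along these lines; the host name and NRPE command are generic examples, not our actual config:)

    # On the Nagios server:
    define service {
        use                  generic-service
        host_name            problemchild
        service_description  /var Disk Usage
        check_command        check_nrpe!check_var
    }

    # On the client, in nrpe.cfg:
    # command[check_var]=/usr/lib64/nagios/plugins/check_disk -w 20% -c 10% -p /var
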
Restart the Nagios server (it doesn't like reloads; every once in a while it goes bonkers and sends out "The sky is falling! ALL services on ALL servers are down! Run for your lives! The End is nigh!" if you try a simple reload).
Hmm... if Nagios is out of date for this host, is Cacti...
Update yum cache again. Run out of disk space again.
<sigh> Good thing this is a VM, with LVM. Add another drive in vSphere, pvcreate, swing your partner, vgextend, lvresize -r, do-si-do!
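(Minus the square dancing, that dance goes roughly like this; the device, VG, and LV names are assumptions, so check lsblk, vgs, and lvs first:)

    # Rescan the SCSI bus so the new vSphere disk shows up:
    echo "- - -" > /sys/class/scsi_host/host0/scan
    pvcreate /dev/sdb                       # initialize the new disk as a PV
    vgextend vg_root /dev/sdb               # add it to the volume group
    lvresize -r -L +8G /dev/vg_root/lv_var  # grow /var; -r resizes the fs too
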
yum repo cache update... FINALLY!
What was I doing again? Oh, right, install Midnight Commander...
Why? Oh yeah, searching for a Puppet file for....?
Right, root password override.

Every time I log into a server, it seems like I find a half dozen things that need fixing. Makes you not want to log into anything, so you can actually get some work done. Oh, right, entropy...