A very bad day.

People often ask me what I do for a living. Here’s a day in my life.

Another day, another program.

A server migration is a very stressful process. In an effort to make some dramatic improvements, you have to move hundreds of websites, email, databases, software and other stuff off of one server and install all of this onto another. No matter how much planning you do, something, somewhere, at some time, is going to break, but you won’t know what until after the fact. Sometimes weeks later. Sometimes the issues trickle in. That’s good. Sometimes, it’s a firestorm of angry clients. That would be really bad and something you want to avoid. So, you do a lot of planning, documenting, testing and more testing… for weeks. You get little sleep and no matter how many hours you work in a month, it’s never enough.

During the actual migration, there are thousands of moving parts being juggled in the air. If one is dropped, bad things happen. The stress level and amount of information overload is so high that you must keep notes for even the simplest processes. I have notes that sound like, “Press the A key, breathe in, press the B key and read the line that comes up on the screen. Now exhale.”

Because I run a small company, I’m also multitasking: answering phones and handling technical support requests from clients while all this is going on.

In the middle of this, one of the servers I’m working on got hacked.

I got a notice that email was being sent out from one of the servers I received as part of a company purchase. This server is managed by a software package called DirectAdmin. Until a month ago, I’d never even heard of DirectAdmin. Now I’m running a company with it and I’m still learning how it works. I logged in expecting the usual experience of tracking down and stopping the spammer without breaking a sweat. Boy was I wrong.

At first it looked like an email box was compromised and that I had to just change the password on that box to stop this guy. Then it dawned on me that there were dozens of accounts on dozens of domains that were sending spam. It wasn’t going to be a typical spammer day. The hunt turned up some logs that showed that someone with admin level access was creating email boxes all over the server. I changed the password for “admin” to stop this person from doing further damage.

To stop the hemorrhaging I shut down the mail server only to find it started up again a minute later. I rightly assumed DirectAdmin was turning the mail server back on. Without the time to try to figure out how to stop that, I did the next best thing. I wrote a script that killed the mail server every 15 seconds. I then cleared the mail queue of thousands of emails.
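
The original script isn’t shown here, so what follows is a minimal sketch of that stopgap, assuming DirectAdmin’s usual Exim mail daemon. The flag-file name is my own invention, added so the loop can be stopped cleanly:

```shell
#!/bin/sh
# Hypothetical sketch, not the actual script. Assumes the mail daemon
# is Exim (DirectAdmin's default). Run "touch /tmp/keep-exim-down"
# first; delete that file to let the loop exit and the mail server
# stay up.
while [ -f /tmp/keep-exim-down ]; do
    killall exim 2>/dev/null    # DirectAdmin will restart it...
    sleep 15                    # ...so knock it back down every 15s
done

# Flush the queued spam with Exim's stock queue tools:
# exiqgrep -i prints queued message IDs, exim -Mrm removes them.
exiqgrep -i 2>/dev/null | xargs -r exim -Mrm 2>/dev/null
```

`exiqgrep` and `exim -Mrm` are Exim’s standard queue-management tools; clearing thousands of messages this way takes seconds.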

Now the problem was figuring out the names of all the bogus email accounts this person had created. It didn’t take too long to figure out that he was using a bot to create the email addresses and, thankfully, that he wasn’t very creative. He only used about 12 different email box names across many accounts. The good news is that he used names that were uniquely misspelled such as “servises” and “ofice”. Now I had to find out where DirectAdmin stored these addresses and if there was a way for me to stop this guy from logging into them. With a little help from the previous owner I found the password files and set about writing a script that would delete only the unique users from these files.
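
A sketch of that cleanup, assuming DirectAdmin’s usual layout of one `/etc/virtual/<domain>/passwd` file per domain with one `user:hash` line per mailbox (the paths and the exact bogus names here are my assumptions, not the original script):

```shell
#!/bin/sh
# Hypothetical reconstruction, not the actual cleanup script.
# Assumption: per-domain mailbox credentials live in
# /etc/virtual/<domain>/passwd, one "user:hash" line per mailbox.
BOGUS='servises|ofice'   # the uniquely misspelled names the bot reused

strip_bogus() {
    cp "$1" "$1.bak"                  # rule #1: back up before deleting
    # Keep every line whose user field does NOT match a bogus name,
    # reading from the backup so we never read and write the same file.
    grep -Ev "^($BOGUS):" "$1.bak" > "$1"
}

for f in /etc/virtual/*/passwd; do
    [ -f "$f" ] || continue           # skip if the glob matched nothing
    strip_bogus "$f"
done
```

One classic way a filter like this empties files is `grep -Ev "^($BOGUS):" "$f" > "$f"`: the shell truncates `$f` before grep ever reads it, leaving zero bytes. Filtering from the backup copy instead sidesteps that whole class of bug.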

What should have been a straightforward script turns out to be quite a challenge when your heart is racing at 100 miles per hour. A few quick tests of the script to debug it and I let it loose on the accounts. It deleted over 900 email boxes. I was able to turn the mail server back on and there was no more spam leaving the server. It wasn’t over yet.

I started getting some tickets requesting help from clients who couldn’t access their email boxes. Upon review I found that the password files were empty: zero bytes. My script had a bug in it that deleted all the users’ email boxes from the password file, not just the bogus ones. I was very thankful that I had followed system administration rule number 1: before you delete something from a file, back it up first. I was able to remove the bug and get email flowing properly again using the backup files.
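
Because every file had a pristine copy sitting next to it, recovery was one loop. A hypothetical sketch, again assuming DirectAdmin’s `/etc/virtual/<domain>/passwd` layout:

```shell
#!/bin/sh
# Hypothetical restore: put every backed-up passwd file back in place
# before re-running the fixed deletion script. Paths are assumptions.
for f in /etc/virtual/*/passwd.bak; do
    [ -f "$f" ] || continue
    cp "$f" "${f%.bak}"    # passwd.bak -> passwd
done
```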

I have to give credit to the hacker; he was really good at how this all worked. Without my modest programming skills, this would have taken about 15 hours to fix. As it was, it took less than 2. I give God the glory for encouraging me to learn some programming.

Shortly after this, I got back to focusing on the migration, only to hear from our migration team that they were having trouble moving certain clients to the new server.

The moments that I get to stop and breathe are often spent in prayer.
