Testing problems with Telnet

There are many different ways to test whether a network port is listening on a system, including GUI port scanners, Nmap and nc. Although all of those work well, and I find myself using Nmap more often than not, not every machine has Nmap installed. Just about every system includes telnet, though. So if I wanted to test whether the SMTP port (port 25) was listening on a server with the IP 192.168.5.5, I could type:

$ telnet 192.168.5.5 25
Trying 192.168.5.5...
telnet: Unable to connect to remote host: Connection refused
In this case, the remote port is unavailable, so I would fall back to some other troubleshooting methods to figure out why. If the port were open and available though, I could just start typing SMTP commands (more on that later).
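If telnet isn't available either, the same reachability check is easy to script. Here's a minimal sketch in Python using only the standard socket module; the host and port are whatever you happen to be testing:

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection performs the full TCP handshake,
        # the equivalent of telnet printing "Connected to ..."
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers "Connection refused", timeouts and unreachable hosts
        return False
```

So port_open("192.168.5.5", 25) would answer the same question as the telnet command above.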

As you can see from the above example, the syntax is to type the command telnet, the IP or hostname to connect to, and the remote port (otherwise it will default to port 23—the default port for telnet). So if I wanted to test a Web server instead, I would connect to the HTTP port (port 80):

$ telnet www.example.net 80
Troubleshoot Web Servers

While you are connecting to port 80, you might as well actually throw some HTTP commands at it and test that it works. For starters, you want to make sure you actually are connected:

$ telnet www.example.net 80
Trying 192.168.5.5...
Connected to www.example.net.
Escape character is '^]'.
Once you are connected, you can pass a basic HTTP GET request to ask for the default index page followed by the host you want to connect to:

GET / HTTP/1.1
Host: www.example.net
The GET request specifies which page (/) along with which protocol you will use (HTTP/1.1). Since most Web servers these days host multiple virtual hosts on the same port, you send the Host header so the Web server knows which virtual host to serve. If you wanted to load some other Web page, you could replace GET / with, say, GET /forum/. It's possible your connection will time out if you don't type it in fast enough; if that happens, you can always copy and paste the commands instead. After you type your commands, press Enter one final time, and you'll get a lot of headers you don't normally see along with the actual HTML content:

HTTP/1.1 200 OK
Date: Tue, 10 Jul 2012 04:54:04 GMT
Server: Apache/2.2.14 (Ubuntu)
Last-Modified: Mon, 24 May 2010 21:33:10 GMT
ETag: "38111c-b1-4875dc9938880"
Accept-Ranges: bytes
Content-Length: 177
Vary: Accept-Encoding
Content-Type: text/html
X-Pad: avoid browser bug


                        

<html><body><h1>It works!</h1>
<p>This is the default web page for this server.</p>
<p>The web server software is running but no content has been added, yet.</p>
</body></html>

As you can see from my output, this is just the default Apache Web server page, but in this case, the HTML output is only one part of the equation. Equally useful in this output are all of the headers you get back from the HTTP/1.1 200 OK reply code to the modification dates on the Web page, to the Apache server version. After you are done sending commands, just press Ctrl-] and Enter to get back to a telnet prompt, then type quit to exit telnet.
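When you find yourself repeating this test, the manual session can be scripted. Here is a rough Python equivalent of the GET exchange above, using only the standard socket module; the hostname is just a placeholder:

```python
import socket

def http_get(host, path="/", port=80, timeout=5):
    """Send a minimal HTTP/1.1 GET, like the manual telnet session,
    and return the raw response (headers plus body) as bytes."""
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"  # ask the server to close when done
        "\r\n"
    ).encode("ascii")
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(request)
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)
```

The status line and headers come back exactly as they do in the telnet session, which makes this handy for quick header checks.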

Send an E-mail

Although I just use telnet for basic Web server troubleshooting, telnet ends up being my preferred tool for e-mail troubleshooting, mostly because it’s so simple to send a complete e-mail with only a few telnet commands.

The first step is to initiate a telnet connection with the mail server you want to test on port 25:

$ telnet mail.example.net 25
Trying 192.168.5.5...
Connected to mail.example.net.
Escape character is '^]'.
220 mail.example.net ESMTP Postfix
Unlike the blank prompt you may get when you connect to an HTTP server, with SMTP, you should get an immediate reply back. In this case, the reply is telling me I’m connecting to a Postfix server. Once I get that 220 prompt, I can start typing SMTP commands, starting with the HELO command that lets me tell the mail server what server is connecting to it:

HELO lappy486.example.net
250 mail.example.net
The nice thing about the interactive SMTP connection here is that if I do somehow make a typo in a command or make a mistake, it should let me know; otherwise, I should get a 250 reply. After HELO, you use the MAIL FROM: command to list what e-mail address the e-mail should appear to be from. I say appear to be from, because you can put just about any e-mail address you want here, which is a good reason not to blindly trust FROM addresses:

MAIL FROM: <user@example.net>
250 Ok
I used to type in the e-mail address directly without surrounding it with <>. My personal Postfix servers are fine with this, but other mail servers are stricter and will reply with a syntax error if you don't surround the address with <>. Since this FROM address was accepted, you can follow up with RCPT TO: and specify who the e-mail is addressed to:

RCPT TO: <user@example.net>
250 Ok
The fact that the mail server responded with 250 should mean that it accepted the TO address you specified here. Finally, you can type DATA and type the rest of your e-mail, including any extra headers you want to add, like Subject, then finish up with a single period on its own line:

DATA
354 End data with <CR><LF>.<CR><LF>
Subject: Give Telnet a Chance 1
Hi,

All we are saying is give telnet a chance.
.
250 Ok: queued as 52A1EE3D117
When I’m testing e-mails with telnet, I usually put a number in the subject line so I can continually increment it with each test. This way, if some e-mail messages don’t get delivered, I can tell which ones went through and which ones didn’t.

Once you are done with the DATA section and the e-mail is queued, you can type quit to exit:

quit
221 Bye
Connection closed by foreign host.
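Since I run this conversation so often, it can help to generate the command sequence instead of typing it. Here is a small sketch that just builds the lines of the session above, with the addresses as placeholders, ready to paste or pipe into the server with whatever tool you prefer:

```python
def smtp_session(helo_host, mail_from, rcpt_to, subject, body):
    """Build the SMTP command sequence used in the manual session above.
    Addresses are wrapped in <> since stricter servers require it."""
    return [
        f"HELO {helo_host}",
        f"MAIL FROM: <{mail_from}>",
        f"RCPT TO: <{rcpt_to}>",
        "DATA",
        f"Subject: {subject}",
        "",       # blank line separates headers from the body
        body,
        ".",      # a single period on its own line ends the DATA section
        "quit",
    ]
```

The numbered-subject trick works just as well here: pass "Give Telnet a Chance 2", "… 3" and so on with each run.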

Telnet may be a little old hat these days, but adding another tool to your troubleshooting armoury is never a bad thing.

Posted in Managed Hosting | Leave a comment

Amazon CloudWatch Monitoring Scripts for Linux

Amazon have extended the CloudWatch monitoring to include some Linux scripts. These scripts run in the background and can push system metrics to CloudWatch.

The following metrics can be obtained:

  • Memory Utilisation – Memory allocated by applications and the operating system, exclusive of caches and buffers, as a percentage.
  • Memory Used – Memory allocated by applications and the operating system, in megabytes.
  • Memory Available – System memory available for applications and the operating system, in megabytes.
  • Disk Space Utilisation – Disk space usage as a percentage.
  • Disk Space Used – Disk space usage in gigabytes.
  • Disk Space Available – Available disk space in gigabytes.
  • Swap Space Utilisation – Swap space usage as a percentage.
  • Swap Space Used – Swap space usage in megabytes.

Note: Disk space for one or more mount points or directories can be reported on.
Note: CloudWatch only stores data for 2 weeks.

The scripts use IAM (Identity and Access Management) credentials to submit the data to CloudWatch.
Output from the scripts can be used with Auto Scaling (e.g. scale up if memory utilisation is high).
Aggregated metrics can be obtained from multiple instances (e.g. total memory usage across all EC2 instances).
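The scripts themselves handle the API calls, but to give a sense of what gets submitted, here is a sketch of the shape of a single metric entry as the CloudWatch put-metric-data call expects it; the instance id and numbers below are made up:

```python
def memory_metric(instance_id, used_mb, total_mb):
    """Build one CloudWatch MetricData entry for memory utilisation,
    expressed as a percentage of total memory."""
    return {
        "MetricName": "MemoryUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Unit": "Percent",
        # e.g. 512 MB used of 2048 MB total -> 25.0
        "Value": round(100.0 * used_mb / total_mb, 2),
    }
```

The InstanceId dimension is what lets CloudWatch aggregate the same metric across multiple EC2 instances.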

More information here


Remote rebuild – Centos/RHEL

Until now, rebuilding a server meant burning a disc or preparing a USB stick and taking a trip to wherever the machine was hosted. Now that I have learned how to remotely rebuild a (working and ssh'able!) machine, things are about to change.

As mentioned above, this way of rebuilding only works for servers that are already accessible, because you need SSH to reconfigure the GRUB/kernel options.

Preparation.

  1. Locate one of the Centos/Red Hat mirrors that could be used to network install – in my example I’ll install Centos 6.3 and use this mirror: http://mirror.stshosting.co.uk/centos/6.3/os/x86_64/
  2. Login to the machine that needs rebuilding
  3. Enter the /boot/ folder/partition:
    cd /boot/
  4. Download installation copy of initrd.img and rename it to initrd.img-install, run:
    wget -O 'initrd.img-install' http://mirror.stshosting.co.uk/centos/6.3/os/x86_64/isolinux/initrd.img
  5. Download installation copy of vmlinuz and rename it to vmlinuz-install, run:
    wget -O 'vmlinuz-install' http://mirror.stshosting.co.uk/centos/6.3/os/x86_64/isolinux/vmlinuz

  6. Just to be on the safe side, copy the current version of grub.conf – this may be useful if you decide at the last minute that you don't want to rebuild after all and need to recover the machine to its previous state. Run:
    cp /boot/grub/grub.conf /boot/grub/grub.conf-original
  7. And finally create new grub.conf with:

cat > /boot/grub/grub.conf << "EOF"
default=0
timeout=0
hiddenmenu

title Install Centos 6.3
root (hd0,0)
kernel /vmlinuz-install ip=192.168.1.50 netmask=255.255.255.0 gateway=192.168.1.254 dns=8.8.8.8 repo=http://mirror.stshosting.co.uk/centos/6.3/os/x86_64/ vnc vncpassword=C3nt0S lang=en_US keymap=uk sshd
initrd /initrd.img-install
EOF

DON'T FORGET TO ADJUST THE KERNEL OPTIONS TO MATCH YOUR ENVIRONMENT – the machine will not be usable if, after the reboot, Anaconda can't connect to the internet.

Once you have made sure all the network settings are OK, restart the machine – it will boot into the installer, and Anaconda will start the installation, to which you'll be able to connect with a VNC client using the IP and port 5901 (in my case, 192.168.1.50:5901). Give it a few minutes before you try to connect – the graphical installer won't be the first thing started after the reboot.

For explanation of the options used with kernel please visit following links:

https://fedoraproject.org/wiki/Anaconda_Boot_Options?rd=Anaconda/Options
http://www.linuxtopia.org/online_books/rhel6/rhel_6_installation/rhel_6_installation_sn-medialess-editing-grub-conf.html

When browsing through the kernel/Anaconda options, you may notice a 'vncport' option which should allow you to specify your own port for the VNC connection. However, it is not recognised by the Anaconda version used for the Centos 6.3 installation, and the installation fails, leaving you with an unreachable machine.

Good luck


Incorrect Mail Quotas – cPanel

Recently, I noticed an issue with cPanel and mail quotas. The problem was, cPanel was showing that an account was over quota when there was actually no mail.

This stems from a rare problem with corrupt maildirsize files. The fix is simple and just requires running this command:

# /scripts/generate_maildirsize --confirm [account_name]

That will regenerate maildirsize and instantly recalculate the amount of mail in the account, thus rectifying the false report of it being over quota.

The reason a corrupt maildirsize file is such an issue is that cPanel will think the account is over quota and therefore reject incoming mail for that account.
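For context, maildirsize is a plain-text Maildir++ bookkeeping file: the first line defines the quota, and each subsequent line records a size/count delta as mail is added or removed. A small sketch of how the totals fall out of it (the sample data in the test is invented):

```python
def maildirsize_totals(lines):
    """Sum the byte and message deltas from a Maildir++ maildirsize file.
    The first line (e.g. '262144000S') defines the quota; every later
    line holds 'bytes messages' deltas, which may be negative."""
    total_bytes = total_msgs = 0
    for line in lines[1:]:
        size, count = line.split()
        total_bytes += int(size)
        total_msgs += int(count)
    return total_bytes, total_msgs
```

When one of those delta lines is corrupt, the running totals drift from reality, which is exactly what the generate_maildirsize script corrects by recounting the mailbox from scratch.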


Removing and testing weak SSL Ciphers

SSL certificates form the backbone of online shopping, as they provide the vendor verification and data encryption that makes e-commerce a safe and viable retail option.

However, due to support for older, legacy encryption methods, the current implementations of the SSL/TLS protocols retain support for much weaker ciphers than are now permissible under PCI (Payment Card Industry) regulations. So, any vendor wishing to maintain PCI compliance will need to ensure their server does not accept these low level ciphers – typically those using 64 bit or lower encryption. Any ciphers that allow anonymous or unencrypted connections also need to be removed.

On a typical LAMP server, running Apache/mod_ssl, you should ensure your httpd.conf or ssl.conf files have the following directives:

SSLProtocol -ALL +SSLv3 +TLSv1
SSLCipherSuite ALL:!aNULL:!ADH:!eNULL:!LOW:!EXP:RC4+RSA:+HIGH:+MEDIUM

Note: Any option prefixed with ! is removed from the cipher list. Any option prefixed with + is added/enabled in the cipher list.

The first line ensures only the most recent versions of the SSL/TLS protocols are used (SSLv3 and TLSv1), and all other versions are disabled using the -ALL switch.

The second line defines which ciphers are used. As you can see, LOW ciphers are removed, as are anonymous connections (aNULL) and connections offering no encryption (eNULL). Only HIGH and MEDIUM level ciphers are permitted – those using 128 bit encryption upwards.

Apache will need to be restarted after any changes, to make them live.

You can then use openssl command line tests to check that the weak ciphers have been disabled.

Run the following command, replacing <your_domain_name> with the domain name of a site secured by an SSL certificate:

echo 'GET / HTTP/1.0' | openssl s_client -cipher LOW -connect <your_domain_name>:443

If the weak ciphers have been disabled, you should get a 'handshake error', and no certificate information will be returned.

The -cipher switch forces the test to only use the specified level of cipher, so you can replace LOW with aNULL and eNULL, and test them too.

Next, try running:

echo 'GET / HTTP/1.0' | openssl s_client -cipher HIGH -connect <your_domain_name>:443

As this forces the use of high level ciphers, the connection should be successful and return valid certificate information. You can run the test again, replacing HIGH with MEDIUM, to ensure medium level ciphers are also permitted.
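The same handshake tests can be scripted if you want to check a batch of servers. Here is a rough Python equivalent of the openssl s_client commands above, using the standard ssl module; it returns False both when the server refuses the handshake and when the requested cipher list isn't even available in the local OpenSSL build:

```python
import socket
import ssl

def cipher_accepted(host, ciphers, port=443, timeout=5):
    """Return True if a TLS handshake restricted to the given
    OpenSSL cipher string (e.g. 'LOW', 'HIGH') succeeds."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # we are testing ciphers,
    ctx.verify_mode = ssl.CERT_NONE     # not certificate validity
    try:
        ctx.set_ciphers(ciphers)
    except ssl.SSLError:
        return False  # cipher list unknown to the local OpenSSL build
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.cipher() is not None
    except (ssl.SSLError, OSError):
        return False  # handshake rejected or connection failed
```

On a compliant server, cipher_accepted(your_domain, 'LOW') should come back False while 'HIGH' succeeds, mirroring the openssl results.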

Assuming all test results are as indicated, your server will now only be using PCI compliant SSL ciphers.

Further reading:

http://httpd.apache.org/docs/2.2/mod/mod_ssl.html

http://www.openssl.org/docs/apps/ciphers.html

 


MariaDB

We have recently been looking into MariaDB as an alternative to MySQL. It seems to be all over the place at the moment, with organisations like Wikipedia and Fedora investigating its use, so it looks as though it's going to make an impact sooner rather than later.

As a drop-in replacement for MySQL it's easy to install – just remember to back up all your databases first, as you need to uninstall MySQL beforehand and that will remove /var/lib/mysql.

Following these instructions, you'll be up and running in no time:
https://kb.askmonty.org/en/installing-mariadb-with-yum/

There are some nice features available in MariaDB, like row-locking backups, but for the most part you will never notice the difference. Version-by-version comparisons have MySQL slightly ahead on speed, but with recent versions MariaDB can actually be faster when tuned correctly for your database type.

For example, normalised tables can use subquery caching, and large tables can use the first-result-stop functionality.

Overall, from what I have seen of it, I like it and will certainly be looking further into it for my own projects.


WordPress update 3.5.1

There is a new maintenance and security release for WordPress: version 3.5.1.

It fixes three quite serious issues:

  • A server-side request forgery vulnerability and remote port scanning using pingbacks. This vulnerability, which could potentially be used to expose information and compromise a site, affects all previous WordPress versions.
  • Two instances of cross-site scripting via shortcodes and post content.
  • A cross-site scripting vulnerability in the external library Plupload.

More information can be found here – http://wordpress.org/news/2013/01/wordpress-3-5-1/


Package manager Composer

The problem

You have devoted many hours of your time to a particular PHP framework; you understand all of its cool libraries (database interaction, form builders, API integration etc.) and can build projects quickly using it. Now, for whatever reason, you need to use a different framework, or perhaps can't use one at all, for your new project. In the former case, you have to learn a whole new set of libraries; in the latter, you must cope without any at all. Imagine a world where you could start a new project and choose which libraries you wanted to use, independent of any framework: project-specific libraries that you simply drag in to your project.

The solution

Composer is a package manager. It allows developers to create and share packages, and other developers to plug those packages in to their projects. Behind it sits a large repository of packages, whose popularity can be gauged by their download counts.

How it works

Put simply, you install PHP packages from the command line directly into your current project directory, and each package is only available to that specific project. For those familiar with Drupal, this is the equivalent of installing modules on a per-project basis; and in the same way that Drupal modules can depend on other modules, so can Composer packages. Composer handles these dependencies for you automatically.
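As a minimal illustration of that workflow, you declare your dependencies in a composer.json file in the project root; the package here (Monolog, a logging library) is just the example Composer's own documentation uses:

```json
{
    "require": {
        "monolog/monolog": "1.*"
    }
}
```

Running composer install then downloads the package and its dependencies into a local vendor/ directory, and including vendor/autoload.php in your code makes the classes available to that project only.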

Vs. Pear

Traditionally, PHP relied on its veteran package manager, PEAR. The difference is that PEAR installs its packages globally, meaning the libraries become available to every project on the server. Developing a PEAR package also requires the code to be in a specific PEAR format, which is no good for existing libraries that would be worth giving to the community. As the community turns towards Composer, it is becoming rare to find a developer who actively uses PEAR, with most instead choosing a fully supported framework.

The future

Composer picked up a lot of support in 2012, and 2013 can expect to see an even wider spread adoption from the PHP community. In June 2012, there were 2,000 available Composer packages. In January 2013, there are now 7,000 packages. Similarly, in April 2012, Composer reported 200,000 package downloads for that month, compared to 2,000,000 (and counting) downloads for the month of January 2013. Given that many popular frameworks have already adopted Composer, it is sure to become an essential methodology for PHP developers.

Further Reading

Official website
getcomposer.org

Official package repository
https://packagist.org

Hands on introduction
http://net.tutsplus.com/tutorials/php/easy-package-management-with-composer/


DTRX

DTRX is an open-source application that is probably more useful for home and office users than for server admins, but I still found it a useful, time-saving tool to have.

DTRX stands for "Do The Right Extraction". Effectively, all the application does is act as a proxy between your request to extract a file and the actual extraction taking place. It can handle archive formats such as tar, zip, cpio, deb, rpm, gem, 7z, cab, lzh, rar, gz, bz2, lzma and xz.

This means that no matter what type of archive you download, you can simply run dtrx on it and you'll always get the files uncompressed; you don't even need to pass different flags for the different types of archive.

Take the hassle out of uncompressing archives with dtrx http://brettcsmith.org/2007/dtrx/ :-) .


Useful tips and tricks with VIM

While working with Alfresco files, I had to find and replace numerous strings within XML files. The task included replacing hex colour codes, which I achieved with the following Vim commands:

:s/#code1/#code2
Replaces the first instance of #code1 with #code2 on the current line.

—–

:s/#code1/#code2/g
With the g flag added, the command replaces every instance of #code1 with #code2 on the current line.

——

By adding the % range, the whole file will be searched:
:%s/#code1/#code2/g
This command replaces every instance of #code1 with #code2 throughout the whole file.

—–

:%s/#code1/#code2/gc
Replaces every instance of #code1 with #code2 throughout the whole file, but asks for confirmation before each replacement.

—-

:50,$s/#code1/#code2/gc
Replaces every instance of #code1 with #code2 from line 50 to the end of the file, asking for confirmation before each replacement.
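If you ever need the same replacement outside Vim, say in a script over many files, the global substitute maps directly onto a regex replace. A small Python sketch mirroring :%s/#code1/#code2/g, with made-up colour codes standing in for code1 and code2:

```python
import re

# A tiny stand-in for the XML/CSS being edited
text = "background: #ff0000;\nborder: #ff0000;\ncolor: #00ff00;\n"

# Equivalent of :%s/#ff0000/#0000ff/g -- replace every match in the whole "buffer"
result = re.sub(r"#ff0000", "#0000ff", text)
```

Using count=1 in re.sub would instead mimic the plain :s form, which stops after the first match.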
