
Community Blogs

Preventing unauthorized SSH access using Denyhosts

Once when I was doing a regular tail -f /var/log/messages, I came across a number of messages like these.

sshd[29924]: PAM_NAM: User donk unknown to the authentication module
sshd[29924]: Failed password for invalid user donk from 'IP address here' port 63410 ssh2

My SSH server was under continuous attack! Hmm... that is, until I found DenyHosts.

DenyHosts is a cool little Python script by Phil Schwartz that parses the logs, identifies repeated authentication failures, and adds the offenders' IP addresses to /etc/hosts.deny, thus preventing them from connecting to the server in the first place.


As the program was not available in the official repositories for SLES 10 SP1, I had to do a manual installation. The steps are detailed in the 'Readme.txt' file within the package.

First, the python-devel package has to be installed, as it is not installed by default:

zypper install python-devel

Download the latest version of DenyHosts from

The version available at the time of my setup was 2.6. After uncompressing the sources

tar zxvf DenyHosts-2.6.tar.gz

cd DenyHosts-2.6

python install

The above step installs the scripts and config files in /usr/share/denyhosts and in the site-packages directory of the Python installation.


Before proceeding, the file denyhosts.cfg must be edited to suit the installation environment. The example config file is fully commented, so it should be easy to follow. I had the following config:


SECURE_LOG = /var/log/messages
HOSTS_DENY = /etc/hosts.deny
LOCK_FILE = /var/run/

After this, I did the following (as mentioned in the readme) to run DenyHosts as a daemon at system start.

cd /usr/share/denyhosts

chmod 700 daemon-control

ln -s /usr/share/denyhosts/daemon-control /etc/init.d/denyhosts

/etc/init.d/denyhosts start

tail -f /var/log/denyhosts # will contain messages related to the start

If it is working as intended, enable it to start automatically at boot:

chkconfig denyhosts on

It happened occasionally that some valid IPs were listed in /etc/hosts.deny. To prevent this, the genuine IPs from which users connect can be added to a file called 'allowed-hosts' in /usr/share/denyhosts/data. There is no specific format; just add the IPs to the file, one per line. Also, edit the corresponding variable in denyhosts.cfg and restart denyhosts.
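As a sketch of that last step (the addresses below are placeholders; the data directory is the one used in this install), you can stage the file in a scratch directory first and then copy it into place as root:

```shell
# Build an allowed-hosts whitelist: one IP per line, nothing else.
WORK=$(mktemp -d)
cat > "$WORK/allowed-hosts" <<'EOF'
EOF
wc -l < "$WORK/allowed-hosts"    # two trusted addresses listed
# Then, as root:
#   cp "$WORK/allowed-hosts" /usr/share/denyhosts/data/allowed-hosts
#   /etc/init.d/denyhosts restart
```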


That's it.


Using lftp to synchronize folders with an FTP account

lftp is a powerful FTP client that can be used to sync with a remote account. In Ubuntu 9.04 it is already installed, so all you have to do is figure out how to use it. :)

First, you'll need two "scripts": one to download files from the remote FTP server to your computer and one to upload them from your computer to the server.

Download script:

Create a file named download.x with the following content:

open -u user,password -p [port] [server]
mirror -c -e /remote_directory /local_directory

You will need to fill in your username and password; also specify the port (usually 21 for plain FTP, or 22 if you connect with an sftp:// URL) and the server address. Also insert the absolute paths to the remote and local directories.

The effect of the -e option in the second line is that files that no longer exist in the remote directory will be deleted from the local directory; drop the option if you don't want that behaviour.
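Because -e deletes files, it can be worth previewing a run first. A sketch, assuming your lftp build supports mirror's --dry-run option (the credentials and paths are the same placeholders as above):

```shell
# Write a preview variant of the download script and inspect it;
# --dry-run makes mirror print planned actions without performing them.
PREVIEW=$(mktemp)
cat > "$PREVIEW" <<'EOF'
open -u user,password -p [port] [server]
mirror -c -e --dry-run /remote_directory /local_directory
EOF
grep -c 'dry-run' "$PREVIEW"     # confirm the option is in place
# lftp -f "$PREVIEW"             # then run the preview
```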

Upload script:

open -u user,password -p [port] [server]
mirror -c -e -R /local_directory /remote_directory


Only a few things changed in the upload script: the -R option is used because we want to upload from the local directory to the remote one. Also note that the order of the two directories is reversed relative to the download script.

There are many other options for lftp; just, you know, man lftp.

Now, to download the files from the remote FTP server to the local directory, open a terminal and type:

$ lftp -f download.x

Note: if the download.x file is not in your home directory, you'll have to write the path to it.

To upload the files to the remote directory use the command:

$ lftp -f upload.x
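If the sync should happen on a schedule, either script can be driven from cron. A sketch (the paths and timing are examples, not from the article), added via crontab -e:

```
# Upload every night at 02:30, logging output for later inspection
30 2 * * * lftp -f /home/user/upload.x >> /home/user/lftp-upload.log 2>&1
```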


Hope this helps.


A secure remote folder share while traveling

Task: provide secure access to your home file server via the Internet.

Instructions are geared for Debs... that is, Ubuntu/Debian-based systems.

Task: set up a file repository on your home webserver, so that you can access your files any time you are out, connected to a hotspot or coffee shop, or tethered to your AT&T phone via Bluetooth* (*see my other blog for how to do that one).

Add SSH to your webserver: apt-get install openssh-server (the package that provides sshd).

Harden SSH so that it is more secure: 1. change the default port, 2. disallow root access, 3. specify only the needed users.

1. For security reasons, we move SSH from port 22 to something higher, 10022 for example (network scanners are less likely to find you and attempt to break in via a brute-force username/password attack).

You do this by modifying /etc/ssh/sshd_config, changing the statement "Port 22" to a port number above 1024 and below 65535, e.g. Port 10022.

2. Then change "PermitRootLogin yes" to "PermitRootLogin no".

3. Then add a statement that restricts who can log in; keep it minimal, like this: AllowUsers foo1 foo2

Restart ssh like this:   sudo /etc/init.d/ssh restart
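Taken together, the three changes amount to this sshd_config fragment (the port number and usernames are the examples used above; substitute your own):

```
Port 10022
PermitRootLogin no
AllowUsers foo1 foo2
```

From the command line, you would then connect with ssh -p 10022 foo1@your-server-address.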


Now, test your login by attempting to SSH in as root (it should be denied), then try the login name you specified in #3 above.

works?   good, so far so good...

Now move along to your laptop, Eee machine, or whatever you carry around with you. On Ubuntu, go to "Places", then "Connect to Server", then select SSH from the service type pull-down menu. In the Server box, type in your webserver's IP address; in the Port box, put in whatever you used in #1 above.

Username will be your user name that you allowed in #3 above.

Then, when you hit Connect, it will prompt for a password; enter it and choose to "remember it forever".

Once this is done, your "remote" folder will show up in your file browser, letting you pull files from your home server to your remote device as needed.





Been using Wine to play some old Windows games.

Strangely, these six-to-eight-year-old commercial Windows games run better under Linux using Wine than they do on a fairly modern Windows XP box.

Myth, StarCraft, and SimCity 3000 all just worked on Linux.

I have also found Linux-based clients for several games that work perfectly. The Quake 2 and 3 source code compiled very easily on my system and worked flawlessly.

 Source code for Quake 2:

I had to install a few libX* development libraries to get the code to compile. To get sound working, I had to edit the Makefile, build sdlquake, and run that version.

I used a couple of set commands to set the 1280x800 resolution of my monitor:

quake +set r_customwidth 1280 +set r_customheight 800 +set r_mode -1

Source code for Quake 3:

Here is an interesting site that analyzes Quake source code:

The following site has a lot of installers for many older games to run them under Linux: 

 I also got the Linux version of Myth II Soulblighter installed using these directions :

 I haven't played any of these games for years, one nice thing about being unemployed.  :)

 Happy gaming.


Shovelling Up the Mess left behind by Windows

One of the interesting uses for Linux nowadays is to regain compatibility with older software and hardware. As far as software is concerned, this can be broadly divided into two groups: applications and entertainment (mainly games).

Application Software

There isn't much call amongst home users for running older applications beyond a certain age; and any company that knows what it is doing will migrate its data and document formats to newer versions of software as they are procured.

If an application vendor goes out of business, an older version of their software can continue to be run unless there is a conflict with a newer operating system, at which point Virtualization or Emulation can take over.

Of course, the ideal solution is to convert the data into platform-neutral formats, with standards defined by a consortium. This gives businesses the assurance that they will still have access to their data through solutions from competing vendors.

Entertainment Software

For many years now, some of us have been using (or writing!) emulation software to play our favourite titles from platforms which are now consigned to history. We accept that this workaround is needed, as the hardware and operating system we are using now are so radically different to what the title originally required. There is a label describing the hardware / software required to run the title, such as 'ZX Spectrum', 'Amiga' or 'Atari ST', and we can therefore accept that we need an Emulator to run on 'Linux', 'Windows' or 'Wii'.

But 'PC' games are a different matter. There is a whole slew of older games for DOS / Windows 3.x / Windows 9x that will just not run on NT-based Windows, or that require workarounds, often performing poorly in the process. Worse still, x64 Windows systems do not have the NTVDM necessary to run DOS programs at all, and support for 16-bit Windows applications has been removed entirely.

The fact is that nowadays, an x86 / x64 Linux machine equipped with DOSBox, DOSEMU, and Wine (including the proprietary forks such as Cedega and CrossOver Games) is in a far better position to run these older games than a modern off-the-shelf Windows system. DOSBox also contains an x86 CPU emulator, and can thus be run on any supported architecture.


Modern Windows systems are great for supporting new hardware; Linux less so. You buy a £5 webcam today, and it's virtually guaranteed to work on Windows (and 'Mac' if it is written on the label). If the hardware manufacturer is serious about supporting open source, and releases specifications and / or reference drivers, then you are fairly likely to find a means of getting the device to work on Linux or other open-source systems.

Otherwise you are relying on a talented person who knows the necessary driver APIs (either kernelspace, such as v4l, or userspace, such as SANE) and is prepared to experiment with the hardware, either for their own use or for a bounty, to write the necessary driver support. However, once the support is in Linux, it usually stays. Even if the APIs change, there are usually enough interested maintainers who will modify the code as necessary to make sure that older drivers still compile.

This is in stark contrast to Windows, where all you get from the manufacturer is binary blobs and a few mystic 'sys' and 'inf' files. If Microsoft change their kernel ABI, or port Windows to a new architecture (as x64 arguably is, since you cannot mix x86 and x64 code in a process, least of all the kernel!), then older drivers will break. If the manufacturer has gone out of business, or no longer wishes to support a device, the end user loses.

Microsoft have also made this situation worse: all drivers installed onto Vista x64 SP1 have to be signed, effectively ending any hopes of enthusiastic Windows users writing kernel-mode drivers (were there ever any?), or of companies that don't consider it economically viable to pay WHQL fees to certify the drivers for older devices when they could just sell a new device.

I have a specific example here: my mother was using a 'Packard Bell'-branded Mustek scanner under Windows. Support for these drivers was dicey under Windows XP SP2 and onwards, often requiring her to reboot after scanning a single page before she could scan another. However, under SANE on Linux it works perfectly with the 'gt68xx' backend, and is likely to do so for many years to come. Yes, the software is more complicated for her to use, but she is relieved that it works flawlessly, and is grateful it will continue to do so.


I'm hoping that Microsoft's power-hungry grab for the brand label of 'PC' ("Personal Computer") to describe their particular operating system running on a particular CPU / hardware architecture will be a part of their undoing. People were attracted to the 'PC' brand in earlier days for its large software and hardware compatibility; it is now clear that Microsoft cannot even remain compatible with their own standards, and that the community has had to provide that compatibility instead.

It's not just Microsoft though; by clinging to the brand name of 'Macintosh', Apple has made a similar rod for their own back. If you were using OS X prior to 10.5 on a PPC machine, thanks to the 'Classic' emulator you would be guaranteed to be able to run almost any 'Macintosh' title. However, the shift to 10.5 and Intel processors has ended their 'support' of this. Personally, I believe the label 'Macintosh' to be highly irrelevant in describing modern Apple systems.


Having now forgotten the enthusiasts who put them where they are, Microsoft and Apple haven't banked on those enthusiasts still being around, and being able to influence the decisions of non-enthusiasts. We want to carry on running older games. We don't want to have to ditch perfectly good peripherals because of an operating system upgrade.

As those PC users (who know enough to be able to support themselves) try to use their older peripherals and software and find that they fail to work on Windows, yet work fine in Linux, the knowledge and reputation of Linux will increase. So the future is actually kinda bright for us, at least in this area!

Although it might seem like flame bait to suggest it on a Linux site, I'm kind of hoping ReactOS achieves good XP compatibility and look and feel. Users are reluctant to change, and at least they could carry on with their Windows habits whilst embracing the freedom that open-source software brings.

Richard Foulkes is not a technical journalist. He holds no fashionable industry qualifications and has never spoken at a conference or even been to one. He makes no claim of being an IT professional. He is also not currently employed.


First post, or my Linux experiences

Since this is my first post, I thought that I'd just write a bit about what I'm doing here and about my experiences with Linux in general.

My first experience with Linux came several years ago, when I was still very naive about computers and thought Windows was the best thing since the computer itself. I found Red Hat 7, downloaded it, and installed it on an old box. I was actually pretty decent with it. Then I forgot the root password, and half-formatted the drive while trying to install Debian in all its 22-disc glory. (I would later format another drive trying to install Debian, as well.)

Later, I switched over from Windows - partially due to my frustration with its slowness - to Mac. (My dad is still a Mac fan to this day.) I loved it, and it was on Mac that I got my first real experience with Python for Real Programming. (Another computer - this one a Power Mac - nearly succumbed to my Debian disk formatting during this period, but fortunately it wouldn't read the discs.) Just for fun, I decided to set up a Web server on a ten-year-old computer I would later call Adelie. I installed a distribution on it called "Ubuntu Server," and due to my Mac experience with the command line, had little trouble configuring it.

A month or two later, I built a kick-awesome desktop tower named Rockhopper. Dual-core processor, 1760M of RAM, and 500GB disk. Based on my previous experiences with the Ubuntu Server, I decided to dual boot Windows XP Pro and the standard Ubuntu desktop. I ended up using Ubuntu full-time, and recently purged XP from the computer to reclaim the full 500GB.

Eventually, I traded out GNOME for Xfce, installed Ubuntu on a laptop too, and set up GoboLinux on some random box. (I never realized how long compiling from source can really take.) In fact, one of the things I love about Linux is how changeable everything is. This site seems like a really good idea to me. I'm hoping that I'll be able to share my expertise beyond the Ubuntu forums, and write some interesting blog articles. It's getting late where I live, so I'll finish for tonight, but I'll probably try to do something on here tomorrow.



I have been waiting for this.

I used to be a "regular" of

The gestation was long (it's always like that when you wait...), but now I appreciate why!

Bravo for all the work - this new web site is an impressive achievement.

I'm happy, because I've become free, totally free, when the last barrier, an iMac, fell two weeks ago. It was replaced, at my request to my employer, by a System76 machine running Ubuntu. So now, at work and at home, I'm free! My wife and I run Ubuntu on all our computers, and we're happy with it!

And all this because some guy, in Finland, some 20 years ago, decided he too wanted to be free!

Need I say more... ?


So Tragic

I've collected some cute things when I was at the Linux World expo last time. I have this rubber penguin keychain and used to hang it on my bag.

Isn't it cute?






Isn't it tragic :(

I brought it everywhere I went, and many times I've realised it got stuck in the car door, the building door, and everything else when they closed. Will find a new one :)



Testing mail servers with swaks

Article Source:
Date: April 16th 2009

I hadn't seen this tool before, so I figured I would share. Swaks is the "Swiss Army knife of SMTP", according to its homepage.

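As a hedged sketch of basic usage (the addresses and host below are placeholders, not from the article): swaks takes a recipient with --to, while --server and --port select the SMTP server to exercise.

```shell
# Compose a basic swaks invocation; running it would send a test
# message through the chosen server. Shown without executing here,
# since it needs a live SMTP host on localhost:25.
CMD='swaks --to --server localhost --port 25'
echo "$CMD"
```

Run the printed command against a disposable mailbox; adding --from lets you control the envelope sender as well.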


A Microsoft problem, or a "Me" problem

I'll be the first to admit that I probably should know Windows better than I do (being in the IT field, after all), but sometimes I just have this feeling of "why should I, when I know the Linux way better?" Of course, not everything is that black and white, but perhaps an example is in order.

Today, I was at a friend's house, and they wanted help backing up a hard drive. By some odd piece of work, it was a SATA drive whose enclosure didn't work for some reason or another. Now, I do have an IDE/SATA-to-USB converter, so I offered to rip the drive to an ISO for them. They declined and just hooked the hard drive up to their computer. Alas, the drive spun up but was not detected by Windows, and their tools apparently won't rip ISOs from a drive they can't detect. So I offered again, and managed to detect the drive through "dmesg | tail", though I could not mount it. Either way, that didn't stop me from dd-ing the drive, but the disk space of my netbook was definitely a limiting factor.

Since I was going to be there for a while, I plugged in the netbook, had them share a folder on their computer, and (after apt-getting smbfs from the Ubuntu repositories) had it mounted across the network a second later. After that, it was relatively smooth sailing until the copy finished (though I did run `watch -n 1 -d "ls -alh backup.iso | cut -d ' ' -f5"` in another window to keep track of its general progress).
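Parsing ls -l with cut works, but the position of the size field is fragile; a variant I'd sketch instead polls the byte count with stat (demonstrated here on a scratch file, since the real backup.iso isn't at hand):

```shell
# Create a 100 KiB scratch file, then read its exact size in bytes:
# the same figure you would watch grow during a real dd copy.
F=$(mktemp)
dd if=/dev/zero of="$F" bs=1024 count=100 2>/dev/null
stat -c %s "$F"    # prints 102400
# During a real copy:  watch -n 1 "stat -c %s backup.iso"
```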

Of course, at this point I am stuck with the thought "most of these tools are common tools on Linux", and immediately follow it up with the thought "why can't Windows have such tools on hand?". Now, I grant that similar tools probably exist for Windows, but why would I want to hunt them down when I know they are built in on Linux?

To me, Linux just isn't about being pretty; it's also about having function, and I really appreciate the days when a tiny laptop that I just use for surfing the web and chatting with friends comes in and helps solve a big issue, while the "big Windows computers" are forced onto the sidelines due to lack of functionality. Why do I derive such enjoyment from such a thought? Perhaps it's because I always hear "oh, but Linux isn't compatible" and "but Linux can't do that" all the time, and it's nice to actually have the power to show them that Linux isn't just for big servers locked up in a room somewhere.


What to do when you can't ping a computer

What do you do when you need to check the status of a computer / device on your network that doesn't respond to pings?

A lot of computers / devices on a network are configured to ignore ping requests. This can sometimes leave you wondering if they are really offline. 

internet:~ # ping
PING ( 56(84) bytes of data.

--- ping statistics ---
5 packets transmitted, 0 received, 100% packet loss, time 4009ms

So how do you ping something that doesn't reply?

