
Community Blogs



Going against the grain

I have been an advocate of Linux since the first day I set eyes on Red Hat 4.2, at the age of 14. I had owned my new PC for less than six months when I decided to wipe Windows 95 from the disk and install Red Hat with its awesome NeXTSTEP-style window manager.

That was the first day I started to argue the point about the difference between Windows and Linux. Sure, Windows had loads of software, but Linux had so much potential. Still, living with my parents meant that if they needed to use a computer they wanted Windows; they could not possibly use Linux.

It continued like this from there on. At school I was the odd one out, the teachers concerned that I might do something which in their eyes would be the equivalent of digital Armageddon.

For me, Linux really took off when I discovered Usenet, where I found more and more applications, scripts and distributions, and I quickly moved on to Debian.

When I went to college, I found myself on an IT course in which the lecturers and the IT support staff had never used anything other than Windows. They were stuck in their ways, only teaching and supporting Windows and Microsoft technologies.

Despite arguing my case for Linux for several months and requesting to use an alternative to Windows, I was always shunned, to the point where I was warned that if I didn't conform I would be kicked off the course.

Being restricted like this wasn't good for me, and I soon dropped out, wanting to find a way to stretch my wings and teach myself what I wanted to learn. I felt that Linux was certainly the way to go.

It was certainly the right thing to do. In every job I have had, I have brought in Linux. Each time it has been the same: I promote it, people see me as being a bit odd, and I suggest time after time that we should use Linux for X, eventually wearing them down to the point that I get to do it. Every time I have managed to deliver and then some, whether because Linux was more forgiving with some questionable hardware or because it 'just worked'.

My biggest win was working at an educational establishment where the majority of the infrastructure was Apple-based, with its 'crash-proof technology' and its 'it just works' motto. I loved the fact that over six years I moved critical services away from OS X Server to Linux, seeing server uptimes of over a year on the Linux machines, compared to the weekly reboots required by their fruity counterparts.

I found, however, that attitudes changed when I became a developer. Once I started working with people who embrace technology and don't sit on the rigid rails of Microsoft-brand software, it was easy to convince them to use Linux servers, and to show the benefits of using it day-to-day as a desktop, a staging system, and the basis of every new project I develop. Now I am Senior Developer for a multi-million pound company, one of the fastest growing tech companies in the UK and one of the top 200 growing tech companies in EMEA. I put a lot of faith in the tools I use, and they have never let me down.

Even after 16 years I still get strange looks, my wife still refuses to use Linux, and I can easily empty a room just by showing my passion. Despite all this, I continue on, promoting Linux, promoting open source technology, and always being there if anyone wants help making the same change too.

 

My Nerd Story: Learn By Doing

My nerd story started in the winter of 2007, when I was 13 years old. It all began with a simple challenge from a friend to look at some HTML tutorials, which I did, and I was captivated from the first second. I started doing very simple things like writing text and changing the background colour of a page, you know, beginner's HTML stuff, but I found it a lot of fun. For some reason, this stuff was second nature to me from day one.
 

My Nerd Story: Ham Radio, Atari, and UNIX

My geek story started early, probably because my dad and grandfather were into amateur radio (ham radio) in a pretty hard-core way from the time I was little. I remember my dad studying for one of the Morse code exams when I was maybe 4 or 5 years old, and, being the little sponge that I was, I picked it up pretty easily. Nothing like a mouthy toddler shouting the answers to motivate someone to learn.

 

New Year's Resolutions for SysAdmins

Ah, a new year, with old systems. If you recently took time off to relax with friends and family and ring in 2014, perhaps you're feeling rejuvenated and ready to break bad old habits and develop good new ones. We asked our friends and followers on Twitter, Facebook, and G+ what system administration resolutions they're making for 2014, and here's what they said. 


The Electric Grid Vehicle

I made a sketch of what I consider to be an interesting future transport development. The idea is that you wirelessly draw power from the road to charge electric vehicles on the move. With induction strips embedded in the road, you wouldn't have to stop to recharge as you previously did with fossil fuels. Available energy from the road also means the battery could be made cheaper and a lot lighter.

The image displays a Grid Car on its way past an old gas station. The car is a hybrid for the longer distances between electrified roads.

With this I would like to show that an electric vehicle future might not be so bad after all. Of course, an electric car future would mean more IT, and so more opportunities for Linux.

[Image: GridCar sketch]

 

Password guessing with Medusa 2.0

Medusa is my password forcer of choice, mainly because of its speed. If you're hoping to try it on a Windows box, sorry, you're out of luck: as far as I know there is no Windows port, in which case your next best alternative is Hydra (see last week's post). Medusa was created by the fine folks at foofus.net, and the much-awaited Medusa 2.0 update was released in February of 2010. For a complete change log please visit http://www.foofus.net/jmk/medusa/changelog

Medusa is a command line tool; as far as I know there is no GUI front end. But don't let that scare you, it's super simple to operate. The foo magic of compiling from source is the hardest part. If you're running Ubuntu, Medusa is in the repository, and starting with Ubuntu 10.10 the Medusa packages were updated to the 2.0 release. If you're a Fedora fan, good news: a Medusa RPM is available, and with Fedora 16 Medusa was updated to release 2.0 (anything prior will use Medusa 1.5). Other distros may have to compile from source.

Compiling Medusa from source:

1. Download the Medusa 2.0 source from foofus.net
2. Decompress the tarball:
   tar -xvf medusa-2.0.tar.gz
3. Perform the usual compile foo magic:
   ./configure
   make
   make install

One word of caution: during the ./configure process a module check is performed. If dependencies have not been met, Medusa will not support those modules. You'll have to ensure all dependencies are satisfied before running make and make install. Have a look here if you run into trouble: http://foofus.net/~jmk/medusa/medusa.html

Installing Medusa from the Ubuntu repository:

1. apt-get update
2. apt-get install medusa

Basic password guessing with Medusa: if you'd like to see all Medusa options, execute medusa with no switches. If you'd like to see all supported modules, execute medusa -d

In its most basic form, Medusa requires the following information:

1. Target host
2. User name or text file with user names
3. Password or text file with passwords
4. Module name

For example, if I want to try a single password guess of abc123 against the Administrator account on a Windows box with an IP address of 192.168.100.1:

medusa -h 192.168.100.1 -u Administrator -p abc123 -M smbnt

In a Windows environment the Administrator account is special in that it is the only account which cannot be locked out. Watch out, though: some environments remove this feature, so before you brute force accounts, make sure you know the lockout policy. But let's pretend that in this example the Administrator account does not lock out. This means I can attempt as many password guesses as I'd like. In this case I'd download a pre-compiled password list, let Medusa loose, and wait.

medusa -h 192.168.100.1 -u Administrator -P passwordlist.txt -M smbnt

Depending on the latency between you and the target host, limiting concurrent attempts may be a good idea; this can be accomplished with -t. If you'd like Medusa to stop after the first successful username and password combination, use -f.

Medusa is simple, fast and effective. I especially love the number of modules it supports, including web forms. How many times have you wanted to password guess a web site login? With Medusa it is possible: simply provide the proper URL. Medusa even supports SSL, and if your target is relying on security through obscurity by running on a non-standard port, Medusa supports that too; specify non-standard ports with -n.

Administrators should be auditing passwords regularly. Weak passwords are your number one concern: if you allow users to generate a weak password, they will. Your best bet is to implement a good password policy and enforce it. For more information please visit our blog at: www.digitalboundary.net/wp
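As a quick recap of the switches covered above, here is a sketch of a run against the same host that reads user names from a file, limits Medusa to two concurrent attempts and stops at the first valid combination. The users.txt and passwordlist.txt file names are placeholders, and -U (the file-based counterpart of -u) is how the builds I have used take a user name list; run medusa with no switches to confirm the options on yours:

   # placeholder host, user list and password list; -t 2 throttles attempts, -f stops on the first hit
   medusa -h 192.168.100.1 -U users.txt -P passwordlist.txt -M smbnt -t 2 -f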
 

Squid and Digest Authentication

This week I want to review Digest authentication, which is a step up from Basic proxy authentication; not the best choice, but an improvement. Digest authentication hashes the password before transmitting it over the wire. Essentially it sends a message digest generated from multiple items including the username, realm and nonce value. If you want to know more, see RFC 2617. The thing to remember is that both Basic and Digest are on the weak end of the authentication security spectrum; if your only choices are Basic and Digest, the lesser of two evils is Digest.

Digest is very similar to Basic from a configuration perspective. Squid uses an external helper program to facilitate the authentication process. The following pieces are required in the "OPTIONS FOR AUTHENTICATION" section of squid.conf:

auth_param digest program
auth_param digest children
auth_param digest realm
auth_param nonce_garbage_interval
auth_param nonce_max_duration
auth_param nonce_max_count

The first three parameters are similar in nature to Basic authentication:

auth_param digest program - location of the external helper program
auth_param digest children - number of spawned processes to facilitate user authentication requests
auth_param digest realm - string presented to the user when the authentication prompt appears on screen

Digest authentication introduces the concept of a 'nonce' (number used once). This is a generated value, in this case generated by Squid. The client uses this value in conjunction with the password during the hashing process; without nonce salting, captured hashed passwords could be replayed. The nonce value is regenerated at specified intervals to ensure its continual uniqueness.

auth_param nonce_garbage_interval - specifies how often Squid should clean up its nonce cache
auth_param nonce_max_duration - specifies how long a nonce value remains valid
auth_param nonce_max_count - places a limit on how many times a nonce value may be used

The last piece of the puzzle is a database of valid users and their associated passwords. Typically this is a hashed text file stored on the Squid server. You should know that Squid does not offer any capabilities for managing it; most users generate it manually or with scripts. On an Ubuntu-based Squid server the Digest helper program is located at:

/usr/lib/squid3/digest_pw_auth

Given the above configuration parameters, the final product should look like this:

auth_param digest program /usr/lib/squid3/digest_pw_auth -c /etc/squid3/password-file
auth_param digest children 5
auth_param digest realm My Realm
auth_param nonce_garbage_interval 5 minutes
auth_param nonce_max_duration 30 minutes
auth_param nonce_max_count 50

Don't forget you must also adjust the Squid ACLs; the procedure is identical to Basic authentication, reviewed last week. Regarding the password file, it should be hashed to keep prying eyes off user passwords. By the way, the "-c" in the program parameter above means you're specifying the location of a hashed password file. This concludes Digest authentication; don't forget to restart your proxy server. Next week I'll talk about NTLM authentication, since most of you are using Windows networks. To find out more visit: www.digitalboundary.net/wp
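The post leaves creating the hashed password file up to you. One minimal sketch, assuming the htdigest utility from Apache's apache2-utils package is available and that the helper accepts the username:realm:hash lines htdigest produces; the file path matches the example above and the user names are placeholders:

   # create the file and add the first user (you are prompted for the password)
   htdigest -c /etc/squid3/password-file "My Realm" alice
   # add further users without -c so the existing file is not overwritten
   htdigest /etc/squid3/password-file "My Realm" bob
   # assumption: Squid runs as the 'proxy' user on Ubuntu; keep the file private to it
   chown proxy:proxy /etc/squid3/password-file
   chmod 640 /etc/squid3/password-file

Note that the realm given to htdigest must match the realm configured in squid.conf, since the realm is part of the stored hash.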
 

Squid and Basic Authentication

This is perhaps the easiest authentication helper to configure in Squid, but also the most insecure. The biggest problem with Basic is that it transmits the username and password in clear text, which makes it very susceptible to network sniffing and man-in-the-middle attacks. The only reasons I'm writing about it are that it's still a valid authentication mechanism in some limited circumstances, and that I want to show you how authentication has evolved over the years. Ultimately you want to use Kerberos authentication with your Squid proxy, but before we got there we had Basic. And here is how to configure it.

The first thing that requires our magic touch is Squid's configuration. Locate and open squid.conf. The first section you'll come across is for configuring authentication. It's called:

# OPTIONS FOR AUTHENTICATION
# -----------------------------------------------------------------------------

You'll notice there are many comments in this section explaining all the different options, but let's jump ahead to what we came here for. Locate the following lines; note they will be commented out. Enable them by removing the hash character '#':

auth_param basic program
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours

If you haven't noticed already, the first parameter, auth_param basic program, configures the location of an external helper program. This helper program is named pam_auth and on an Ubuntu system is located in the /usr/lib/squid directory. In fact, all authentication helpers are located in this directory. Therefore our first line should look like this:

auth_param basic program /usr/lib/squid/pam_auth

Next we have the children parameter. This configures the number of processes spawned to handle incoming authentication requests. In the above example pam_auth will spawn 5 separate processes to handle all authentication requests. Anywhere between 5 and 10 helper processes is a good starting point. If Squid runs into trouble, it will tell you in /var/log/squid/cache.log; monitor this file closely.

Then we have the realm parameter. This is a string which is presented to the user when the authentication prompt appears on screen. With Basic authentication this is an arbitrary string value; you can use anything, like "Welcome to my really cool Proxy Server. Enter your Username and Password".

Lastly we have the credentialsttl parameter, which dictates how long Squid caches authentication results internally. Keep in mind a small value increases Squid's load, while a larger value reduces it. You may need to play with this if you notice your Squid box is really busy.

The last piece of the puzzle is enabling Squid's authentication ACL. This involves two additional directives (acl and http_access). The default ACL grants or denies access based on client subnets;

acl localnet src 192.168.0.0/24

is an example of one. To enable authentication, comment out the default ACL and replace it with this:

acl authenticatedusers proxy_auth REQUIRED

Lastly, enable the above access list, named authenticatedusers:

http_access allow authenticatedusers

That's it. Restart the Squid service and you should now be prompted for a user name and password. Your session will remain authenticated until you close your browser. www.digitalboundary.net/wp
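Before pointing squid.conf at it, it can be worth checking that the helper accepts your system credentials on its own. A small sketch, assuming pam_auth follows the usual Squid basic-helper protocol of reading "username password" on standard input and answering OK or ERR; the user name and password are placeholders, and sudo is used because PAM typically needs to read /etc/shadow:

   # feed one "username password" pair to the helper and read its verdict
   echo "alice her-password" | sudo /usr/lib/squid/pam_auth
   # prints OK if PAM accepts the credentials, ERR otherwise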
 

Steps to Hosting a Web Site on Ubuntu 11.04

1. Install apache2, then go to /etc/apache2

2. cd sites-available

3. cp default  new_web_site_name_file

4. Make the changes as follows (this is for name-based virtual hosting):

NameVirtualHost x.x.x.x
// this line will already be there; just change the * to your IP address

ServerAdmin your-mail-id
// this entry will also be there; just change the mail id

Add the following three entries according to your needs:

 DocumentRoot /var/www/your-dir-name/   // the web site's root path
 DirectoryIndex login.php   // the default login page entry
 ServerName your-web-site-address.com   // your web site's address

Leave the other entries as they are.

Save the file (a rough sketch of the finished file is shown below).
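Putting the edits from steps 1-4 together, here is a minimal sketch of how the finished file in sites-available might look on the Apache 2.2 shipped with Ubuntu 11.04. The IP address, directory name and domain are placeholders for your own values, and the Directory block simply mirrors the one already present in the copied default file:

    NameVirtualHost x.x.x.x:80

    <VirtualHost x.x.x.x:80>
        ServerAdmin webmaster@your-web-site-address.com
        ServerName your-web-site-address.com
        DocumentRoot /var/www/your-dir-name/
        DirectoryIndex login.php

        # carried over from the default file; adjust to your needs
        <Directory /var/www/your-dir-name/>
            Options FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>
    </VirtualHost>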

 

Run the following commands as the root user, or use sudo:

a2ensite new_web_site_name_file   (the file name used in step 3)

Then restart the Apache web server:

/etc/init.d/apache2 restart   (or reload)


You can access the web site after making the necessary changes in the /etc/hosts file on your system, or access it on the intranet/internet after getting its entry added to the DNS.
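For quick testing before the DNS entry exists, the /etc/hosts change mentioned above is a single line mapping the server's IP address to the site name; both values below are placeholders:

    x.x.x.x    your-web-site-address.com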

 

Regards

Harkamal Dadwal

First Post

Well, I've been doing some reading through the forums here as well as the blogs for a while, so it's about time for me to write something, I guess :)

I actually have some questions that someone can hopefully help me out with. At the moment my laptop is running Linux Mint 11 with working proprietary drivers for the ATI graphics.

But I have also tried the Fedora live CD, and I must say that I really like the interface: not too Unity-ish, but nice looking and easy to use. However, when I tried a fresh install I ended up with the black screen of death, so I gave up.

I also had a look at openSUSE, but I can't really decide whether I should switch away from the Ubuntu/Debian-based world, and if so whether to go with Fedora or openSUSE. Are there any pros and cons that would be good to know about them?

Or should I just go with the one that feels best? I want a nice community in case help is needed along the journey, and I've bumped into some really unfriendly ones over the years =D

 

SPI Board and Officer Elections - 2011

Software in the Public Interest (SPI) is pleased to announce the results of
the recent board and officer elections.

Board elections were held from July 14-28, 2011.

The board terms of David Graham and Jimmy Kaplowitz expired at this
election. In addition, one board seat was vacant at the time of the election,
for a total of three available seats. David Graham chose not to stand at
this election and has retired from the board. SPI would like to thank David
for his participation on the board from 2004 to 2011. Clint Adams, Robert
Brockway, Jimmy Kaplowitz and Trevor Walkley stood for election. Jimmy
Kaplowitz was re-elected to the board, and Clint Adams and Robert Brockway
were newly elected. SPI would like to thank all candidates for
their participation in the election and congratulate Clint Adams, Robert
Brockway and Jimmy Kaplowitz on their election to the board.

The current directors are:

    * Bdale Garbee
    * Joerg Jaspert
    * Jonathan McDowell
    * Michael Schultheiss
    * Clint Adams
    * Robert Brockway
    * Joshua D. Drake
    * Jimmy Kaplowitz
    * Martin Zobel-Helas

Officer elections were held at the board meeting on August 10, 2011. All
existing officers were re-elected unopposed.

The officers for 2011-2012 are:

    * President: Bdale Garbee
    * Vice-President: Joerg Jaspert
    * Secretary: Jonathan McDowell
    * Treasurer: Michael Schultheiss

SPI associated projects include:

    * ankur.org.in
    * aptosid
    * Debian
    * Drizzle
    * Drupal
    * freedesktop.org
    * Fresco
    * Gallery
    * GNUstep
    * GNU TeXmacs
    * Jenkins
    * LibreOffice
    * madwifi.org
    * OFTC
    * OpenOffice.org
    * OpenVAS
    * Open Voting Foundation
    * Open64
    * OpenWrt
    * OSUNIX
    * Path64
    * PostgreSQL
    * Privoxy
    * The HeliOS Project
    * Tux4Kids
    * Yafaray

Software in the Public Interest, Inc. is a not-for-profit corporation under
the laws of New York State.
 
