Community Blogs
My Nerd Story: Class, queerness and the transformative nature of technology and open source.

Beth 'pidge' Flanagan is a Senior Software Engineer for Intel and spends most of her time working on Open Embedded Core and the Yocto Project, mainly as the release engineer and maintainer of the yocto-autobuilder. She is also a geek, a queer trans woman, a motorcyclist, and a practitioner of random bits of general purpose geekery. She has been working in IT/software engineering now for the past 23 years.

I was born and raised right outside of Newark, NJ. My family was working class and I grew up in a working class neighborhood full of first and second generation immigrants from Ireland, Scotland, Brazil, Italy, Puerto Rico, etc. Basically, a neighborhood that most people wouldn't think of as a fertile bed for nerds. I tell people to imagine some of the grittier scenes from The Sopranos and they'll get an accurate idea of where I grew up.

 

The Sopranos - Satriale's Pork Store

I realized at a very young age that I was a trans woman and that without a well thought out plan, I wouldn't be able to survive the conservative confines of that world. This need to escape was further compounded by the fact that I was at the bottom of the social rung at school. I was bookish, had a serious lisp, and had a severe femoral torsion which caused me to walk pigeon-toed (hence the nickname I carry to this day). A classroom full of boys, and some of the girls, marked me as "different" from my first day at school and did not let up in their abuse for the entirety of my elementary school career.

When I was about 9 or so, I had a pretty good idea that all the praying in the world wouldn't make me not trans and that I should probably spend some time figuring out what to do about it. So, I petitioned my father for an adult library card (remember a time when 'looking stuff up' included a trip to an actual library?). I remember asking him if he would sign the papers for my library card, and he handed me the largest book on the bookshelf he could find, 'The Crusades' by Zoe Oldenbourg. He told me, "Read this and do a book report and I'll sign the permission slip." I read it in about a month, and that signed permission slip opened up a world I could never have dreamt of.

That library was my salvation. In its stacks I learned, in carefully hidden books, that I could do something about being trans. For the first time I could remember, the serious depression I had been in since age 6, when I figured out that I wouldn't grow up to be a woman, not at least without a little bit of help, abated somewhat. The library became my second home. It was where I spent my days, hiding from the world. I went into full-on reading mode, devouring anything I could get my hands on, but always ending up back in the science row with its minuscule selection of books on computer science. But they did have an entire set of "The Art of Computer Programming". I flipped through it somewhere around age 10 and didn't understand one bit of it! Somehow though, I was strangely enamored with the idea that language could be turned into something that made machines do work.

I mentioned before that people generally don't think of the working class as a hotbed of nerdism. If anything, I think the reality is the exact opposite. When you grow up without a lot of money you end up learning how to make things last and fix things that need repair. My family was no different. My father was a fairly decent carpenter who tried, bless him, to teach me, with absolutely no success. His mechanical skills were impressive, something I ended up being able to learn from much later in life. My grandmother, however, taught me how to crochet. In crocheting I saw math and patterns, and it taught me how patterns could create beauty.

When you're the kind of strange effeminate kid in a working class world that I was, you end up spending a lot of time alone and quickly learn to entertain yourself. One summer I spent a full week alone in my backyard with a roll of tin foil, a magnifying glass and a thermometer, seeing what the highest temperature I could achieve was. That was also the year I built a booby trap for the back door of the house (I was afraid of burglars). I forgot to unset it, and it almost knocked my mother out when she opened the door and a few of my brother's baseball bats came flying out, full speed, towards her face.

1982 came around and something happened that would change my life forever. It all started with two lines.

10 PRINT "I HATE SCHOOL!!!"
20 GOTO 10

I still remember those first two lines of code I ever wrote. It was a 10 year old kid's 'Hello World'. The Catholic school I attended had invited a computer education company in to do an optional computer class. I begged my parents to let me take it. I remember the first day I stepped into that class: about a dozen or so Commodore PETs, with the ever so high tech audio cassette storage devices.

 

Commodore PET 4016

After the first few classes, you just stopped trying to load your prior work from tape at the start of class as it took forever to load. You got really good at remembering what you did the week before and learned to type quicker than the audio tape could load. I ended up
falling asleep at night listening to those tapes (SkreeeetchWoooooSkreeeeeeetch!); in love with the idea that you could store STUFF on tape other than music!

So, here I was, this kid who was absolutely on the bottom of the social ladder. I was despised by the kids at school, and my ability to have control over my life was greatly impacted by overly protective parents, my age, and obvious gendered behaviour differences, but... for those 45 minutes a week in 1982, I had, for the first time in my life, actual agency. I could sit there and tell a machine to do whatever I wanted it to, and the results were up to me. It wouldn't beat me up. It wouldn't make fun of me for the way I walked, or held my books. It wouldn't call me awful things. It would just do what I told it to do. (This generally entailed new and more complex ways of spitting out how much I hated school, to be perfectly honest.)

Those little two lines of code turned into a much larger program that year and my parents ended up trying to nurture the one thing I had shown an actual interest in. I'm still unsure of how my father afforded it, but one day he came home with a Timex Sinclair 1000, literally
the cheapest computer there was. I actually recall using it quite a bit, but, as the concept of needing to store things was a bit beyond my dad, who was a truck driver, he had neglected to buy the audio tape drive. I would have to leave it on for weeks with a note on it, telling people not to shut it off or I'd lose my program.

 

Timex Sinclair 1000

But no matter how much computers could act as an escape for me, there was still this huge thing I had to deal with, and as I got older and the effects of puberty started to hit, my depression worsened. I stopped writing code in my junior year of high school and just focused on trying to make it through the day. By the time I hit university I was an absolute wreck from trying to deal with being trans. So, after the first year, I made the best decision ever: I quit, moved to Washington DC, and finally had the space to figure out what my plan was.

I moved back home after about a year because I had gotten fairly sick. By this time, my mother had gone from being a secretary to getting a degree in accounting to being a VP at a small software company. Behind my mother's back, I finagled a job there. I will always remember the engineering manager who risked her wrath to give her weird, green-mohawked kid a job. So, my lucky break came in 1991, at age 19, writing insurance software in MagicPC for 5 dollars an hour.

Eventually, I left to take a job at the local university. Here is where I encountered the second thing to change my life: Windows 95.

It was 1994 and we were previewing the beta of Windows 95 for a migration from Windows 3.11. I absolutely loathed it. There was no integrated TCP/IP stack, I was used to the Solaris command line by that point, and it still had the clunky DOS shell. It was nothing I wanted, and while it was an improvement, I wanted something more, so I went searching for a better solution and found it in Slackware.

I don't remember the exact version of Slackware I finally got to install, but I know the kernel was around 0.99 (before loadable modules and ELF binaries!). It was like a dream and a nightmare rolled into one. When you got it working, it went like clockwork, but it was an absolute TERROR to set up. Package management? Nope, tar.gz and make were your friends. I got really good at debugging makefiles.

But I was hardly bored. I spent way too much time getting kernels recompiled, fighting with X11 settings on my Diamond video card, and wondering why the NE2000 card in my box, when the box was the end node of a token ring, would blue screen all the Windows 95 boxes on the ring. Bored? I was too busy tearing apart this amazing thing that people had put together, in part, just for something cool to do.

It was magic. Here was this thing that didn't work out of the box! I had to actually sit there and figure it all out. That year and a half I spent learning the operating system inside and out gave me a sense of accomplishment, a sense of pride and a sense that if I could survive a Slackware install and make it out on the other end, a gender transition should be a piece of cake, right?

I had finally figured out the logistics of my transition and set a date. To put it mildly, the concept was sound, but the execution went poorly. I lost my job and my family, and the entire situation created a rift that will, unfortunately, probably never heal. So here I was, age 24, with a brand new gender presentation, a high school diploma, and a job history I couldn't use because it was under a different name. I had moved to Philadelphia and was living on a friend's couch because I had been kicked out of home. Things were not looking very positive.

But, there were a few things I did have.

I knew how to write code.

I knew Unix and Linux.

I was too damn stubborn to take "No".

And I was left with no other choice.

I'm not sure how I got hired; I'm sure it was in part a bit of desperation on their part, but within the month I was working as a sysadmin, administering 250 AT&T BSD boxes that ran a computer based testing suite. I ended up porting the program over to Linux, which got me hired to write the next generation of that software.

From there it was on to trying my hand at UI design, with stops in animation, power grid, and control systems. And then, eventually, to my current home in the embedded world.

I look back over the past 30 years since I first sat down at that old Commodore PET and am thankful. I had a mother who, despite our differences, firmly instilled in me the idea that women, even women like me, could do anything. I had a work ethic that taught me that as long as I could do the job, nothing else mattered. I had the stubbornness to not believe the people who were telling me "NO!". I had the curiosity and the drive to figure it out for myself, because I knew that no one was going to tell me how to do it.

My nerd cred doesn't come to me from a piece of paper, but by sheer force of will. I know a lot of my colleagues came to where they're at by the "traditional" route: university, internships, etc. I'm glad for them, but I do not envy them a bit. While my route was the hard, tough slog, I would never trade it for the world.

I firmly believe that my past gives me a perspective in geekdom that is relatively unique. It has made me a better engineer than I think I would have been had I gone that traditional route. It has defined who I am and made me a better person. I can look at people from non-traditional nerd backgrounds and see their inner engineer. I've learned that sometimes you find the most brilliant of people in the least likely of places. I approach new experiences, be they personal or technological, without one iota of fear.

And lastly, I always know that the first program I write whenever I learn a new language is going to be my own special, personal version of the first two-line program I ever wrote.

 

Configuring Apache2 to run Python Scripts

This is meant as a simple write-up to fill a gap in the various HOWTOs I read when trying to set up my Apache2 server to process Python scripts as CGI, though it applies to any CGI scripts (Perl scripts, compiled binaries...).
I've been developing for years (C, C++, PHP), but had never delved into Python before, and I wanted my scripts to have a web interface.

The first step is getting Apache2 to recognize that .py files are to be executed and not spat out as text files.
docs.python.org has some nice HOWTOs (http://docs.python.org/3.3/howto/webservers.html) on how to think about Python and the web, and apache.org has mountains of documentation (http://httpd.apache.org/docs/2.2/howto/cgi.html) on running CGIs. What I didn't find was a simple guide on how to set it up. I'm a developer, not a sysadmin, and while I like knowing how to configure Apache and tune my Linux boxes, sometimes I just want to get my webserver up and running and start coding.

So, in case anyone was going through the same situation as me, here is my quick and dirty setup.

For reference, this setup was done on Ubuntu 13.10, using Ubuntu's default apache2 installation, and python3.
I'm also assuming you know how to configure Apache for a basic HTML site. There are lots of HOWTOs for that.

Starting with the basics:

  • apache install: sudo apt-get install apache2
  • python install: sudo apt-get install python

or

  • python3 install: sudo apt-get install python3

The first step, which in my PHP experience I never had to do and which is not mentioned in the guides above, is to enable CGI processing in Apache:

sudo a2enmod cgi

This will automatically enable mod_cgid if your server is configured with a multi-threaded MPM, which was the case for me.

Then you can either make a folder in your site's path where your cgi files will live, or configure certain directories to handle certain file types as cgi scripts.
This is described well in the Apache doc linked above, but essentially, to have all files in a cgi folder be executed, you would use this conf:

    <Directory /srv/www/yoursite/public_html/cgi-bin>
        Options ExecCGI
        SetHandler cgi-script
    </Directory>

and to allow .py files to be executed as scripts in a particular folder you would use this conf:

    <Directory /srv/www/yoursite/public_html>
        Options +ExecCGI
        AddHandler cgi-script .py
    </Directory>
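One requirement neither conf snippet enforces, and one that in my experience causes mysterious "500 Internal Server Error" responses: the script files themselves must be executable by the Apache user. A quick sanity check from the shell (hello.py here is just a throwaway test script, created inline; use whatever folder your Directory block covers):

```shell
# write a minimal CGI script to the current directory
printf '#!/usr/bin/env python3\nprint("Content-Type: text/plain")\nprint()\nprint("Hello World!")\n' > hello.py

# mark it executable, then confirm the x bits show up in the listing
chmod +x hello.py
ls -l hello.py
```

If the page still fails after the permissions are right, Apache's error log is the next place to look.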

Once you have that, if you're running Python 3, you can make a Python script like this one and stick it in whichever folder is configured for CGI:

    #!/usr/bin/env python
    # -*- coding: UTF-8 -*-
    # enable debugging
    import cgitb
    cgitb.enable()
    print("Content-Type: text/html;charset=utf-8")
    print()
    print("Hello World!")

You can change the first line from

#!/usr/bin/env python

to

#!/path/to/your/python/binary

such as

#!/usr/bin/python3

in case the default python is Python 2 and you want this script to be run by python3.
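Once the hello-world script works, the natural next step is reading query-string parameters. Here is a minimal sketch of one way to do it (the script name greet.py and the name parameter are my own hypothetical choices); it uses urllib.parse from the standard library rather than the older cgi.FieldStorage:

```python
#!/usr/bin/env python3
# -*- coding: UTF-8 -*-
# greet.py -- hypothetical example: greets whoever is named in ?name=...
import os
import urllib.parse

# Apache hands CGI scripts the query string via the QUERY_STRING env var
query = os.environ.get("QUERY_STRING", "")
params = urllib.parse.parse_qs(query)
name = params.get("name", ["World"])[0]

print("Content-Type: text/html;charset=utf-8")
print()
print("Hello {}!".format(name))
```

Requesting /cgi-bin/greet.py?name=Linux should then respond with "Hello Linux!", while a bare /cgi-bin/greet.py falls back to "Hello World!".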

 

Going against the grain

For me, I have been an advocate of Linux from the first day I set eyes on Red Hat 4.2 at the age of 14. I had had my new PC for less than 6 months when I decided to wipe Windows 95 from the disk and install Red Hat with its awesome NeXTSTEP-style window manager.

That was the first day I started to argue the point about the difference between Windows and Linux. Sure, Windows had loads of software, but Linux had so much potential. Still living with my parents, though, meant that if they needed to use a computer, they wanted to use Windows; they could not possibly use Linux.

It continued like this from then on. At school, I was the odd one, the teachers concerned that I might do something which in their eyes would be the equivalent of digital armageddon.

For me, Linux really took off when I discovered Usenet, where I started to discover more and more applications, scripts and more distributions of Linux, where I quickly moved onto Debian.

When I went to college, I found myself doing a course in IT in which the IT lecturers and the IT support staff had never used anything other than Windows. They were stuck in their ways, only teaching and supporting Windows and Microsoft technologies.

Despite advocating for Linux for several months, and requesting that I be allowed to use an alternative to Windows, I was always shunned, to the point where I was warned that if I didn't conform, I would be kicked off the course.

Being restricted like this wasn't good for me, so I soon dropped out, wanting to find a way to stretch my wings and teach myself what I wanted to learn. I felt that Linux was certainly the way to go.

It was certainly the right thing to do. In every job I have had, I have brought in Linux. Each time, it has been the same: I promote it, people see me as being a bit odd for suggesting time after time that we should use Linux for X, and eventually I wear them down to the point where I get to do it. And every time, I have managed to deliver and beyond, whether because Linux was more forgiving with some questionable hardware, or because it 'just worked'.

For me, my biggest win was working at an educational establishment where the majority of the infrastructure was Apple-based, with its 'crashproof technology' and its 'it just works' motto. I loved the fact that over six years I moved critical services away from OS X Server to Linux, seeing server uptimes of over a year on Linux hardware, compared to the weekly reboots required by their fruity counterparts.

I found, however, that attitudes changed when I became a developer. Once I started working with people who embrace technology and don't sit on the rigid rails of Microsoft brand software, it was easy to convince people to use Linux servers, and to see the benefits of using it day-to-day as a desktop, a staging system, and the basis of every new project I develop. Now I am Senior Developer for a multi-million pound company, one of the fastest growing tech companies in the UK and one of the top 200 growing tech companies in EMEA. I put a lot of faith in the tools I use, and they have never let me down.

Even after 16 years, I still get strange looks, my wife still refuses to use Linux, and I can easily empty a room just by showing my passion. Despite all this, I continue on, promoting Linux, promoting open source technology, and always being there if anyone wants help making the same change too.

 

My Nerd Story: Learn By Doing

My nerd story started in the winter of 2007, when I was 13 years old. And it all started from a simple challenge from a friend to look at some HTML tutorials, which I did. And I was captivated from the first second. I started doing very simple things like writing text and changing the background colour of a page, you know, beginner's HTML stuff, but I found it a lot of fun. For some reason, this stuff was second nature for me from day one.
 

Collectl is a powerful tool to monitor system resources on Linux

Linux system admins often need to monitor system resources like CPU, memory, disk and network to make sure that the system is in good condition. And there are plenty of commands like iotop, top, free, htop and sar to do the task. Today we shall take a look at a tool called collectl that can be used to measure, monitor and analyse system performance on Linux. Collectl is a nifty little program that does a lot more than most other tools. It comes with a...

 

My Nerd Story: Ham Radio, Atari, and UNIX

My geek story started early, probably because my dad and grandfather were into amateur radio (ham radio) in a pretty hard core way from the time I was little. I remember my dad studying for one of the Morse code exams when I was maybe 4 or 5 years old, and I, being the little sponge that I was, picked it up pretty easily. Nothing like a mouthy toddler shouting the answers to motivate someone to learn.

 

Nmon – A Nifty Little Tool to Monitor System Resources on Linux

Nmon (Nigel's performance Monitor for Linux) is another very useful command line utility that can display information about various system resources like CPU, memory, disk and network. It was developed at IBM and later released as open source. It is available for most common architectures like x86 and ARM, and platforms like Linux and Unix. It is interactive and the output is well organised, similar to htop. Using Nmon it is possible to view the performance of different system resources on a single screen. The man page describes nmon as "a systems administrator, tuner,...

 

Inxi is an amazing tool to check hardware information on Linux

A very common thing Linux users struggle with is finding out what hardware the OS has detected and how well, because unless the OS is aware of the hardware, it might not be using it at all. And there is an entire ocean of commands to check hardware information. There are quite a few GUI tools like hardinfo and sysinfo on the desktop, but having a generic command line tool is far more useful, and this is where Inxi works well. Inxi is a set of scripts that will detect a...
 

Munich Transition Documentary

Just in general: if someone has connections with anyone over in Munich, what would be the possibility of assembling a documentary covering their decade-long migration to Linux, so the rest of us can better understand all the trials and tribulations that had to be overcome to bring the complete transition to fruition?

 

The benefits of well planned virtualization

One of the biggest challenges facing IT departments today is keeping the work environment running. This stems from the need to maintain an IT infrastructure able to meet the current demand for services and applications, while also ensuring that, in critical situations, the company is able to resume normal activities quickly. And here is where the big problem appears.

Many IT departments are working at their physical, logical and economic limits. Their budget is very small and grows on average 1% a year, while the complexity they manage grows at an exponential rate. IT has been viewed as a pure cost center rather than an investment, as I have observed in most of the companies I have worked with.

With this reality, IT professionals have to juggle to maintain a functional infrastructure. For colleagues working in a similar situation, I recommend paying special attention to this topic: virtualization.

Despite speculation to the contrary, virtualization is not an expensive process compared to its benefits, though depending on the scenario it can cost more than many traditional designs. To give you an idea, today over 70% of the IT budget is spent just keeping the existing environment running, while less than 30% is invested in innovation, differentiation and competitiveness. This means that almost all IT investment is dedicated simply to "putting out fires", solving emergency problems, and very little is spent on solving the underlying problem.

I have followed a very common reality in the daily life of large companies, where the IT department is so overwhelmed that it cannot find the time to stop and rethink. In several of them, we see two completely different scenarios: before and after virtualization / cloud computing. In the first, what we see is a drastic bottleneck, with resources at their limit. In the second, a scene of tranquility, with management, security and scalability assured.

Therefore, consider the proposal of virtualization and discover what it can do for your department and, consequently, for your company.

Within this reasoning, we have two maxims. The first: "Rethink IT." The second: "Reinvent the business."

The big challenge for organizations is precisely this: to rethink. What does it take to turn technicians into consultants?

Increase efficiency and security

As the structure grows, so does the complexity of managing the environment. It is common to see a data center server dedicated to a single application, because best practices ask that each service have a dedicated server. The metric is still valid; without doubt this is the best option to avoid conflicts between applications, performance problems, etc. Even so, environments like this are becoming increasingly wasteful, as processing capacity and memory go increasingly underutilized. On average, only 15% of a server's processing power is consumed by its application; that is, over 80% of the processing power and memory sees no actual use.

Can you imagine the situation? On one hand we have servers that sit virtually unused while others need more resources, and with applications getting ever lighter, the hardware in use grows ever more powerful.

Another point that needs careful consideration is the safety of the environment. Imagine a database server with a failed disk. How much trouble would that cause your company today? Consider the time your business needs to quote, purchase, receive, swap and reconfigure the environment to replace the failed item. During all that time, what does the outage cost?

Many companies are based in cities or regions far from major centers and therefore may not even have considered this scenario.

With virtualization this does not happen, because we leave the traditional scenario, where we have many servers, each hosting its own operating system and applications, and move to a more modern and efficient one.

In the image below, we can see the process of migrating from the physical environment, with multiple servers, to a virtual environment, where we have fewer physical servers hosting the virtual servers.

vmware1

By working with this technology, underutilized servers running different applications and services are assigned to the same physical hardware, sharing CPU, memory, disk and network resources. This brings the average usage of the equipment up to 85%. Moreover, fewer physical servers means less spending on parts, memory and processors, less power and cooling, and therefore fewer people needed to manage the structure.

Vmware2

At this point you may ask: but what about security? If I now have multiple servers running simultaneously on a single physical server, am I at the mercy of that server? What if the equipment fails?

The new thinking is about not only the technology but how to implement it in the best way possible. Today VMware, the global leader in virtualization and cloud computing, works with cluster technology, enabling and ensuring high availability of servers. Basically, if you have two or more hosts working together and one piece of equipment fails, VMware identifies the fault and automatically restarts all of its services on another host. This is automatic, with no IT staff intervention.

During implementation, a physical failure is simulated to test the high availability and security of the environment going forward; the response time is fairly quick. On average, each virtual server can be restarted within 10 seconds, 30 seconds or up to 2 minutes of the previous one. In some scenarios, the entire operating environment can be restarted in about 5 minutes.

Getting new services ready quickly

In a virtualized environment, making a new service available becomes a quick and easy task, since resources are managed by the virtualization tool and are not tied to a single physical machine. This way you can provision a virtual server with only the resources it needs, which avoids waste. On the other hand, if demand increases rapidly, you can increase the amount of memory allocated to that server from day to day. The same reasoning applies to disk and processing.

Remember that you are limited by the amount of hardware present in the cluster: you can only increase the memory of a virtual server if that memory is available in the physical environment. This puts an end to underutilized servers, as you begin to manage your environment intelligently and dynamically, ensuring greater stability.

Integrating resources through the cloud

Cloud computing is a reality, and there is no cloud without virtualization. VMware provides a tool called vCloud; with it, it is possible to have a private cloud on top of your virtual infrastructure, all managed with a single tool.

Reinventing the Business

After rethinking, now is the time to change and to reap the rewards of having an optimized IT organization. When we run a well-structured project, with high availability, security and capacity for growth, everything becomes much easier. Among the benefits we can mention the following:

Respond quickly to expand your business

When working in a virtualized environment, you can meet the demand for new services in a professional manner as your needs arise. This is possible with VMware because it lets you configure a new server in a few clicks; in five minutes you have a new server ready to use. Today this is crucial, since the lead time for starting a new project keeps shrinking.

Increase focus on strategic activities

With the environment under control, management is simple and it becomes easier to focus on the business. That's because almost all of the operational work is taken care of, freeing IT to think in business terms, and that is what turns a technician into a consultant. The team can then be fully focused on technology and strategic decisions, instead of acting as firefighters dedicated to putting out the latest blaze.

Aligning IT department decision making

Virtualization gives IT staff metrics, reports and analysis. With these reports in hand, professionals have a tool that presents the reality of their environment in fairly simple, understandable language. Often, this information supports a negotiation with management and, with it, approval of the budget for the purchase of new equipment.

Well folks, that's all. I tried not to write too much, but it's hard to cover something this important in fewer lines. I promise that future articles will discuss VMware and how it works in a little more detail.

 

How to configure vsftpd to use SSL/TLS (FTPS) on CentOS/Ubuntu

Vsftpd is a widely used FTP server, and if you are setting it up on your server for transferring files, be aware of the security issues that come along. The FTP protocol has weak security inherent in its design: it transfers all data in plain text (unencrypted), and on a public or insecure network this is too risky. To fix the issue we have FTPS, which secures FTP communication by encrypting it with SSL/TLS. And this post shows how to set up SSL...
 