
Community Blogs



Using lftp to synchronize folders with a FTP account

lftp is a powerful FTP client that can be used to sync with a remote account. In Ubuntu 9.04 it is already installed, so all you have to do is figure out how to use it. :)

First, you'll need two "scripts": one to download files from the remote FTP server to your computer and one to upload them from your computer to the server.

Download script:

Create a file named download.x with the following content:

open -u user,password -p [port] [server]
mirror -c -e /remote_directory /local_directory
exit

Fill in your username and password, the port (usually 21 for plain FTP or 22 for SFTP) and the server address (e.g. ftp://domain.com; you can also use sftp://). Also insert the absolute paths to the remote and local directories.

The -e option on the second line deletes files from the local directory that no longer exist in the remote directory; drop it if you don't want that behaviour.
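For example, a filled-in download.x might look like this (the host, credentials and paths here are just placeholders):

open -u myuser,mypassword -p 22 sftp://ftp.example.com
mirror -c -e /public_html /home/me/site-backup
exit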

Upload script:

open -u user,password -p [port] [server]
mirror -c -e -R /local_directory /remote_directory
exit

 

Only a few things change in the upload script: the -R option is used because we want to upload from the local directory to the remote one, and note that the order of the two directories is reversed compared to the download script.

There are many other options for lftp; just, you know, man lftp.

Now, to download the files from the remote FTP server to the local directory, open a terminal and type:

$ lftp -f download.x

Note: if the download.x file is not in your home directory, you'll have to write the path to it.

To upload the files to the remote directory use the command:

$ lftp -f upload.x
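If you want the sync to run unattended, the same commands work from cron; a minimal sketch, assuming the scripts live in /home/me:

# m h dom mon dow command
0 2 * * * lftp -f /home/me/download.x
0 3 * * * lftp -f /home/me/upload.x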

 

Hope this helps.

 

A secure remote folder share while traveling

Task: provide secure access to your home file server via the Internet.

Instructions are geared towards Debs... that is, Ubuntu/Debian-based systems.

Task: set up a file repository on your home web server, so that you can access your files any time you are out, whether connected to a hotspot at a coffee shop or tethered to your AT&T phone via Bluetooth* (*see my other blog post for how to do that).

Add SSH to your web server: sudo apt-get install openssh-server

Harden SSH so that it is more secure: 1. change the default port, 2. disallow root access, 3. allow only the users who need it.

1. For security reasons, we move SSH from port 22 to something higher up, like 10022 for example (network scanners are less likely to find you and attempt to break in via a brute-force username/password attack).

You do this by modifying /etc/ssh/sshd_config and changing the "Port 22" statement to a port number above 1024 and below 65535, e.g. Port 10022

2. Then change "PermitRootLogin yes" to "PermitRootLogin no".

3. Then add a statement that restricts who can log in; keep it minimal, like this: AllowUsers foo1 foo2 (the combined changes are sketched below).
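Taken together, the relevant lines in /etc/ssh/sshd_config end up looking something like this (the port number and user names are just the examples from above):

Port 10022
PermitRootLogin no
AllowUsers foo1 foo2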

Restart ssh like this:   sudo /etc/init.d/ssh restart

 

Now test your login: attempt to log in over SSH as root (it should be denied), then try the login name you specified in #3 above.
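For example, from another machine (substitute your own server address; 10022 is the port chosen in #1):

ssh -p 10022 root@your.server.address    # should be refused
ssh -p 10022 foo1@your.server.address    # should prompt for foo1's password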

works?   good, so far so good...

Now move along to your laptop, Eee machine, or whatever you carry around with you. On Ubuntu, go to "Places", then "Connect to Server", and select SSH from the service type pull-down menu. In the Server box, type in your web server's IP address; in the Port box, put in whatever port you used in #1 above.

Username will be your user name that you allowed in #3 above.

Then when you hit connect, it will prompt for a password, which you enter, and then choose to "remember it forever".

Once this is done, your "remote" folder will show up in your file browser, and you can pull files from your home server to your portable device as needed.

Enjoy..!

Jim

 

 

Been using Wine to play some old Windows games.

Strangely, these six-to-eight-year-old commercial Windows games run better under Linux using Wine than they do on a fairly modern Windows XP box.

Myth, StarCraft and SimCity 3000 all just worked on Linux.

I have also found Linux-based clients for several games that work perfectly. The Quake 2 and Quake 3 source code compiled very easily on my system and worked flawlessly.

 Source code for Quake 2: http://www.icculus.org/quake2/#download

I had to install a few libX* development libraries to get the code to compile. To get sound working I had to edit the Makefile so that it also builds sdlquake, and run that version instead.

I used a couple of set commands to match my monitor's 1280x800 resolution:

quake +set r_customwidth 1280 +set r_customheight 800 +set r_mode -1
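For reference, the rough sequence looked something like the following; the package names are from Ubuntu and may differ on your distro, and the binary name depends on which targets your Makefile builds:

sudo apt-get install build-essential libsdl1.2-dev libxxf86dga-dev libxxf86vm-dev
make                 # after enabling the SDL target in the Makefile
./sdlquake +set r_customwidth 1280 +set r_customheight 800 +set r_mode -1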

Source code for Quake 3: http://ioquake3.org/source-codes/

Here is an interesting site that analyzes Quake source code: http://fabiensanglard.net/quakeSource/quakeSourceNetWork.php

The following site has a lot of installers for many older games to run them under Linux:  http://www.liflg.org/?catid=3 

I also got the Linux version of Myth II: Soulblighter installed using these directions: http://grokthink.com/wordpress/?p=184

I haven't played any of these games for years; that's one nice thing about being unemployed. :)

 Happy gaming.

 

Shovelling Up the Mess left behind by Windows

One of the interesting uses for Linux nowadays is to re-gain compatibility with older Software and Hardware. As far as software is concerned, this can be broadly divided into two groups: Applications and Entertainment (mainly Games).

Application Software

There isn't much call amongst home users for running applications beyond a certain age, and any company that knows what it is doing will migrate its data and document formats to newer versions of software as they are procured.

If an application vendor goes out of business, an older version of their software can continue to be run unless there is a conflict with a newer operating system, at which point Virtualization or Emulation can take over.

Of course, the ideal solution is to convert the data into platform-neutral formats, with standards defined by a consortium. This gives businesses the assurance that they will still have access to their data through solutions from competing vendors.

Entertainment Software

For many years now, some of us have been using (or writing!) emulation software to play our favourite titles from platforms which are now consigned to history. We accept that this workaround is needed, as the hardware and operating system we are using now are so radically different to what the title originally required. There is a label describing the hardware / software required to run the title, such as 'ZX Spectrum', 'Amiga' or 'Atari ST', and we can therefore accept that we need an Emulator to run on 'Linux', 'Windows' or 'Wii'.

But 'PC' games are a different matter. There is a whole slew of older games for DOS / Windows 3.x / Windows 9x that just will not run on NT-based Windows, or require workarounds and often perform poorly in the process. Worse still, x64 Windows systems do not have the NTVDM necessary to run DOS programs at all, and support for 16-bit Windows applications has been removed entirely.

The fact is that nowadays, an x86 / x64 Linux machine equipped with DOSBox, DOSEMU and Wine (including proprietary forks such as Cedega and CrossOver Games) is in a far better position to run these older games than a modern off-the-shelf Windows system. DOSBox also contains an x86 CPU emulator, and can thus run on any supported architecture.

Hardware

Modern Windows systems are great for supporting new hardware; Linux less so. You buy a £5 webcam today, and it's virtually guaranteed to work on Windows (and 'Mac' if it is written on the label). If the hardware manufacturer is serious about supporting Open Source, and releases specifications and / or reference drivers, then you are fairly likely to find a means of getting the device to work on Linux or other Open Source systems.

Otherwise you are relying on a talented person who knows the necessary driver APIs (either kernel-space, such as V4L, or userspace, such as SANE) and is prepared to experiment with the hardware, either for their own use or for a bounty, to write the necessary driver support. However, once the support is in Linux, it usually stays. Even if the APIs change, there are usually enough interested maintainers who will modify the code as necessary to make sure that older drivers still compile.

This is in stark contrast to Windows, where all you get from the manufacturer is binary blobs: a few mystic 'sys' and 'inf' files. If Microsoft change their kernel ABI, or port Windows to a new architecture (as x64 arguably is, since you cannot mix x86 and x64 code in a process, least of all in the kernel!) then older drivers will break. If the manufacturer has gone out of business, or chooses not to support a device, the end user loses.

Microsoft have also made this situation worse: all drivers installed on Vista x64 SP1 have to be signed, effectively ending any hope of kernel-mode drivers being written by enthusiastic Windows users (were there ever any?), or by companies who don't consider it economically viable to pay WHQL fees to have drivers for older devices audited when they could just sell a new device.

I have a specific example here: my mother was using a 'Packard Bell'-branded Mustek scanner under Windows. Driver support was dicey from Windows XP SP2 onwards, often requiring her to reboot after scanning a single page before she could scan another. However, under SANE on Linux it works perfectly with the 'gt68xx' backend, and is likely to do so for many years to come. Yes, the software is more complicated for her to use, but she is relieved that it works flawlessly, and grateful that it will continue to do so.

Thoughts

I'm hoping that Microsoft's power-hungry grab for the 'PC' ("Personal Computer") brand label to describe their particular operating system running on a particular CPU / hardware architecture will be part of their undoing. People were originally attracted to the 'PC' brand for its broad software and hardware compatibility, yet Microsoft cannot even stay compatible with their own standards; it is the community that has had to provide that compatibility.

It's not just Microsoft, though; by clinging to the 'Macintosh' brand name, Apple has made a similar rod for their own back. If you were using OS X prior to 10.5 on a PPC machine, the 'Classic' environment guaranteed you could run almost any 'Macintosh' title. However, the shift to 10.5 and Intel processors has ended that 'support'. Personally, I believe the label 'Macintosh' is now largely irrelevant as a description of modern Apple systems.

 Conclusions

Having forgotten the enthusiasts who put them where they are now, Microsoft and Apple haven't banked on those enthusiasts still being around and able to influence the decisions of non-enthusiasts. We want to carry on running older games. We don't want to ditch perfectly good peripherals because of an operating system upgrade.

As those PC users (who know enough to be able to support themselves) try to use their older peripherals and software and find that they fail to work on Windows, yet work fine in Linux, the knowledge and reputation of Linux will increase. So the future is actually kinda bright for us, at least in this area!

Although it might seem like flame bait to suggest it on a Linux site, I'm kind of hoping ReactOS achieves good XP compatibility and look and feel. Users are reluctant to change, and at least they could carry on with their Windows habits whilst embracing the freedom that Open Source software brings.

Richard Foulkes is not a Technical Journalist. He holds no fashionable industry qualifications and has never spoken at a conference, or even been to one. He makes no claim of being an IT professional. He is also not currently employed.

 

First post, or my Linux experiences

Since this is my first post, I thought that I'd just write a bit about what I'm doing here on Linux.com and Linux at all.

My first experience with Linux came several years ago, when I was still very naive about computers and thought Windows was the best thing since the computer itself. I found Red Hat 7, got it downloaded, and installed it on an old box. I was actually pretty decent with it. Then, I forgot the root password, and half-formatted the drive while trying to install Debian in all its 22-disc glory. (I would later format another drive trying to install Debian, as well.)

Later, I switched over from Windows - partially due to my frustration with its slowness - to Mac. (My dad is still a Mac fan to this day.) I loved it, and it was on Mac that I got my first real experience with Python for Real Programming. (Another computer - this one a Power Mac - nearly succumbed to my Debian disk formatting during this period, but fortunately it wouldn't read the discs.) Just for fun, I decided to set up a Web server on a ten-year-old computer I would later call Adelie. I installed a distribution on it called "Ubuntu Server," and due to my Mac experience with the command line, had little trouble configuring it.

A month or two later, I built a kick-awesome desktop tower named Rockhopper. Dual-core processor, 1760M of RAM, and 500GB disk. Based on my previous experiences with the Ubuntu Server, I decided to dual boot Windows XP Pro and the standard Ubuntu desktop. I ended up using Ubuntu full-time, and recently purged XP from the computer to reclaim the full 500GB.

Eventually, I traded out GNOME for Xfce, installed Ubuntu on a laptop too, and set up GoboLinux on some random box. (I never realized how long compiling from source can really take.) In fact, one of the things I love about Linux is how changeable everything is.

Linux.com seems like a really good idea to me. I'm hoping that I'll be able to share my expertise beyond the Ubuntu forums, and write some interesting blog articles. It's getting late where I live, so I'll finish for tonight, but I'll probably try and do something on here tomorrow.

 

Finally!

I have been waiting for this.

I used to be a "regular" of Linux.com...

The gestation was long (it's always like that when you wait...), but now I appreciate why!

Bravo for all the work - this new web site is an impressive achievement.

I'm happy because I've become free, totally free: the last barrier, an iMac, fell two weeks ago. It was replaced, at my request to my employer, by a System76 machine running Ubuntu. So now, at work and at home, I'm free! My wife and I run Ubuntu on all our computers, and we're happy with it!

And all this because some guy, in Finland, some 20 years ago, decided he too wanted to be free!

Need I say more... ?

 

So Tragic

I collected some cute things when I was at the LinuxWorld expo last time. I have this rubber penguin keychain and used to hang it on my bag.

Isn't it cute?

Before

After

Isn't it tragic? :(

I brought it everywhere I went, and many times I realised it had got stuck in the car door when I closed it, or in the building door, or something else. I will find a new one. :)

 

 

Testing mail servers with swaks

Article Source: http://www.cmdln.org
Date: April 16th 2009

I hadn't seen this tool before, so I figured I would share. Swaks is "the Swiss Army knife of SMTP", according to its homepage.
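For instance, a quick delivery test against a server looks something like this (swap in your own addresses; --to, --from and --server are standard swaks options):

swaks --to postmaster@example.com --from me@example.com --server mail.example.com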

Full Entry

 

A Microsoft problem, or a "Me" problem

I'll be the first to admit that I should probably know Windows better than I do (being in the IT field, after all), but sometimes I just have this feeling of "why should I, when I know the Linux way better?" Of course, not everything is that black and white, but perhaps an example is in order.

Today I was at a friend's house, and they wanted help backing up a hard drive. By some odd piece of work, it was a SATA drive whose enclosure didn't work for one reason or another. I do have an IDE/SATA-to-USB converter, so I offered to rip the drive to an ISO for them. They declined and just hooked the hard drive up to their computer. Alas, the drive spun up but was not detected by Windows, and their tools apparently won't rip an ISO from a drive they can't detect. So I offered again, and managed to detect the drive through "dmesg | tail", though I couldn't mount it. That didn't stop me from dd-ing the drive, but the disk space on my netbook was definitely a limiting factor.

Since I was going to be there for a while, I plugged in the netbook, had them share a folder on their computer, and had it mounted across the network a second later (after apt-getting smbfs from the Ubuntu repositories). After that it was relatively smooth sailing until the copy finished (though I did run watch -n 1 -d "ls -alh backup.iso | cut -d ' ' -f5" in another window to keep track of its general progress).
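Roughly, the whole sequence looked like this (the share path, device name and mount point are made up for illustration):

sudo apt-get install smbfs                              # pulls in the CIFS mount helper
sudo mkdir -p /mnt/share
sudo mount -t cifs //192.168.1.10/backup /mnt/share -o username=friend
sudo dd if=/dev/sdb of=/mnt/share/backup.iso bs=4M      # image the whole drive into the share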

Of course, at this point I'm stuck with the thought "most of these are common tools on Linux", and immediately follow it up with "why can't Windows have such tools on hand?" I grant that similar tools probably exist on Windows, but why would I want them when I know they are built into Linux?

To me, Linux isn't just about being pretty; it's also about function, and I really appreciate the days when a tiny laptop that I mostly use for surfing the web and chatting with friends steps in and helps solve a big issue, while the "big Windows computers" are forced onto the sidelines for lack of functionality. Why do I derive such enjoyment from that thought? Perhaps because I hear "oh, but Linux isn't compatible" and "but Linux can't do that" all the time, and it's nice to have the power to show them that Linux isn't just for big servers locked up in a room somewhere.

 

What to do when you can't ping a computer

What do you do when you need to check the status of a computer / device on your network that doesn't respond to pings?

A lot of computers / devices on a network are configured to ignore ping requests. This can sometimes leave you wondering if they are really offline. 

internet:~ # ping 10.1.2.161
PING 10.1.2.161 (10.1.2.161) 56(84) bytes of data.

--- 10.1.2.161 ping statistics ---
5 packets transmitted, 0 received, 100% packet loss, time 4009ms

So how do you ping something that doesn't reply?
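The full answer is in the linked entry; one common approach (not necessarily the one the article uses) is to probe at a different layer than ICMP:

# a live host on the local subnet answers ARP even if it filters ICMP
sudo arping -c 4 10.1.2.161

# or skip the ping and check whether known service ports answer
nmap -Pn -p 22,80,443 10.1.2.161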

Read more...
 

Using Shell commands to copy, move and remove files —— Chinese Simplified

The cp command

This command copies a given file or directory to another file or directory, much like the copy command under MS-DOS, and it is very powerful.

Syntax: cp [options] source_file_or_directory target_file_or_directory

Description: the command copies the specified source file to the target file, or copies several source files into the target directory.

The options have the following meanings:

-a  Usually used when copying directories. It preserves links and file attributes and copies directories recursively; it is equivalent to combining the -d, -p and -R options.

-d  Preserve links when copying.

-f  Overwrite an existing target file without prompting.

-i  The opposite of -f: prompt for confirmation before overwriting a target file. Answering y overwrites the file; this is interactive copying.

-p  Besides the contents of the source file, cp also copies its modification time and access permissions to the new file.

-r  If the given source is a directory, cp recursively copies all the files and subdirectories under it. In this case the target must be a directory name.

-l  Do not copy; just link the files.

Note that cp can accidentally destroy another file: if the target file name you give already exists, it will be overwritten by the source file once the copy is done. To guard against this, it is best to use the -i option when copying files with cp.
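For example (the paths here are only illustrative):

cp -a /home/user/docs /backup/docs    # copy a directory tree, preserving attributes and links
cp -i notes.txt /backup/notes.txt     # prompt before overwriting an existing target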

 

The mv command

The mv command renames a file or directory, or moves files from one directory into another. It is like a combination of the ren and move commands under MS-DOS.

Syntax: mv [options] source_file_or_directory target_file_or_directory

Description: depending on the type of the second argument (a target file or a target directory), mv either renames the file or moves it into a new directory. When the second argument is a file, mv performs a rename; in that case there can be only one source file (or one source directory), which is renamed to the given target name. When the second argument is an existing directory, there can be several source files or directories, and mv moves each of them into that target directory. When moving a file across file systems, mv first copies it and then deletes the original, and any links to the original file are lost.

The options have the following meanings:

-i  Interactive mode. If the mv operation would overwrite an existing target file, the system asks whether to overwrite it and requires a y or n answer, which helps avoid overwriting files by mistake.

-f  No interaction. mv gives no prompt when overwriting an existing target file; when this option is specified, -i has no effect.

If the given target file (not a directory) already exists, its contents will be overwritten by the new file. To avoid destroying another file by mistake, it is best to use the -i option when moving files with mv.
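For example (again with illustrative paths):

mv -i draft.txt report.txt                  # rename a file, prompting before any overwrite
mv -i report.txt notes.txt /home/user/old/  # move several files into an existing directory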

The rm command

The rm command deletes files you no longer need. It removes one or more files or directories from a directory, and it can also remove a directory together with all the files and subdirectories beneath it. For a link, only the link is removed; the original file is left unchanged.

The general form of rm is:

rm [options] file...

Without the -r option, rm will not remove directories.

The options have the following meanings:

-f  Ignore files that do not exist and never prompt.

-r  Recursively remove all the directories and subdirectories listed in the arguments.

-i  Delete interactively.

Use rm with care: once a file is deleted, it cannot be recovered. To guard against mistakes, use the -i option to confirm each file before it is deleted. If you type y the file is deleted; if you type anything else it is kept.
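For example (be careful with the recursive form):

rm -i old-notes.txt       # confirm before deleting a single file
rm -r /tmp/build-output   # remove a directory and everything under it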

 