Author: JT Smith
exposed all customer records on about 4,000 Web sites. The FBI issued a public
warning directed at the software’s customers, but a small e-commerce Web site
named SawyerDesign.com didn’t notice.”
Category:
- Linux
Author: JT Smith
Stallman's main agenda will be to launch the Indian chapter of the FSF and lecture on the purpose, goals, philosophy, methods, status, and future prospects of the GNU operating system, which, in combination with the Linux kernel, is now used by an estimated 17 to 20 million users worldwide.
Stallman's trip is being organized by the Free Software Foundation of
India (FSF-I), founded in Trivandrum, Kerala, by a group of dedicated users of free software. The FSF defines free software in the sense of freedom, as in "free speech," not gratis, as in "free beer."
Stallman founded the FSF in 1985, dedicating it to promoting computer users' rights to use, study, copy, modify, and redistribute computer programs. In particular, the FSF promotes the GNU operating system (GNU is a recursive acronym for "GNU's Not Unix"), widely used today in its GNU/Linux variant, often mistakenly called just "Linux."
FSF also helps to spread awareness of the ethical and political issues of freedom in the use of software. The FSF believes that free software is a matter of freedom, not price.
The GNU General Public License (GNU GPL) gives each user the freedom to run, copy, distribute, study, change, and improve the software, based on unfettered access to the source code. Being free to do this means (among other things) that you do not have to ask or pay for permission.
While granting the user these freedoms, the GNU GPL defends them by saying that no one is allowed to take them away from anyone else. Any published program that incorporates all or a substantial part of a GPL-covered program must itself be released under the GNU GPL. The GPL ensures that no person or community can privatize the community's free software.
According to FSF-I, a developing country like India should promote and encourage the use of free software not only because India is economically backward and cannot afford expensive, proprietary solutions, but also because of the "digital divide" resulting from the country's diversity in language and literacy levels, as well as access to computers and bandwidth. Free software can help bridge this divide by encouraging solidarity, collaboration, and voluntary community work amongst programmers and computer users, says FSF-I.
Stallman's visit to India comes on the heels of the recent attack on the GPL by Microsoft, which feels threatened by the fast growth of the free software movement. (Stallman and the FSF take pains to distinguish "free software" from "open source," a term that, since 1998, has been used by another group rallying around another celebrated hacker, Eric Raymond.)
Stallman's itinerary in India will include a talk at NCST, Bombay, the launch of FSF-I in Trivandrum, lectures at engineering colleges in Trivandrum and Cochin, a session at Technopark, and a trip to Baroda.
FSF-I expects Stallman's visit to position Kerala as the free software capital of India and to launch FSF-I's activities within the country on issues relating to training, support, and the distribution and dissemination of free software.
I first posted this story at http://www.myiris.com/NewsCentre/index.php”
Author: JT Smith
Alongside the building of the network, there will be plenty of other activities, including presentations and workshops (IPv6, Linux, security, networking).
NE2000 is meant for social, friendly people, especially people who enjoy meeting other people. NE2000 is NOT meant for people who come to lock themselves up in a tent or caravan to play Quake all day; it’s meant for exploring new things together with other people.
The event will take place from 19 to 25 July in the Netherlands; please visit http://www.ne2000.nl for more information.”
Author: JT Smith
- Does Open Source threaten intellectual property?
- What is the difference between the GPL and Microsoft's Shared Source strategy?
- Does Open Source create security risks?
- Does Open Source create software instability?
Panelists will include: Jon 'maddog' Hall, Executive Director, Linux International; Bernard Lang, Directeur de Recherche, INRIA, Secrétaire de l'AFUL (Association Francophone des Utilisateurs de Linux et des Logiciels Libres); and representatives from Borland, Compaq, Intel, Quadratec, SGI, SuSE, and Xybernaut. The panel is open to all. Details: Linux@work Paris, June 13, 2001, http://www.ltt.de/linux_at_work.2001/paris.shtml"
Author: JT Smith
Quality, as he sees it, implies two things. At the most basic level,
software should be free of fatal errors, and it should perform its advertised
functions correctly. At a higher level of quality, software should be free of
annoying flaws, and closed bugs should stay resolved. A defect uncovered in
one place should be fixed everywhere else it occurs. It’s easy to write code
that looks correct, but nothing takes the place of actually running it.
Organized by programmers, for programmers
As more people discover Linux, the potential for end-user feedback
increases. More users mean more configurations, more hardware in use, and
more interactions, all of which may uncover subtle bugs. Big projects like
Mozilla, GNOME, and KDE have comprehensive bug databases, but nothing similar
exists for the kernel. Part of that comes from its unique culture.
Much of the kernel development discussion takes place on lkml, the Linux
kernel mailing list. It can easily see a thousand messages a week. Keeping up
is a Herculean effort. As Crawford discovered, it can overwhelm an end-user
only interested in the status of a bug or a feature. Where would a harried
system administrator or application developer start to look for a solution?
There are voluminous lkml
archives, and there’s always Google,
but locating and aggregating the latest news is a chore.
It’s not that developers don’t crave good feedback, but testing is hard,
unglamorous work. Crawford points out that Linux creator Linus Torvalds has mentioned the need for more testing several times, but it doesn’t compare to the thrill of writing new code. Like debugging or writing documentation, testing can be tedious and time-consuming. Beyond that, it takes a unique set of talents to produce good tests and automated test suites.
Adding users to the mix
That’s where Crawford comes in. If the Quality Database takes off, it will
allow users and developers to correlate defects with kernel versions. The more
people who participate, the easier it will be to home in on the exact problem
and create a solution. Even a report as simple as, “My network card doesn’t
initialize under 2.4.4 with the Tulip driver but worked fine with 2.4.3,” could
narrow the issue down for other users, especially if it eventually included an
explanation or a fix.
The reaction so far has been mixed, at least from the kernel developers.
Several responded positively to the initial announcement. Others wanted feedback
from Torvalds and chief kernel hacker Alan Cox before signing on. Crawford has yet to receive a formal blessing.
Part of the difficulty comes from the kernel development process itself.
Torvalds has resisted putting the kernel into a version control system, preferring
to work sequentially through e-mailed patches. Several lieutenants hold sway
over important subsystems, like the virtual memory system or the filesystem.
Despite this apparent chaos, things come together. Still, interested developers
unfamiliar with the subtle protocols of lkml have to adapt to the ultimate
bazaar if they want to see results.
Attempting to avoid several recurrent flamewars, Crawford emphasizes his
desire to work within the current system. “[It’s] not my plan to try to force a
bunch of big-company software process into the Linux kernel development. I want
it to work as well as possible with what they already have.” Instead of
targeting existing developers, he wants the database to act as a sort of bridge
between users and kernel hackers. To the users, it will be a repository of
defects, versions, and solutions to common problems. To coders, it will be
a source of error messages, configurations, and defects from the field.
The road forward
While recruiting several motivated people to run informal tests by hand
will add valuable data, other ideas can refine the process. One correspondent
brought up the issue of test coverage. How can developers know that every
component, indeed every line, of the kernel has a test somewhere?
An article on the database site links to several automated test suites for
userland software, including Mesa and Python. These exercise certain parts of
the kernel and system libraries. The closest thing in kernel space proper comes
from SGI. It’s nowhere near comprehensive, but it’s a place to start. One can
almost imagine a thousand boxes downloading the latest -ac kernels and running
nightly smoke tests. (The Perl
Smokers group has a similar system already in place for Perl’s development
branch.) If testing a new release and reporting failures is as easy as “make
test,” this will produce an incredible amount of valuable data in a very short
time.
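A nightly smoke test of the kind imagined above could be as simple as the sketch below. This is purely illustrative: the function name, log format, and PASS/FAIL convention are assumptions, not anything the Quality Database actually defines, and it assumes an already-unpacked kernel source tree.

```shell
#!/bin/sh
# Hypothetical nightly kernel smoke test (names and conventions are
# illustrative). Given a kernel source tree, attempt a configure-and-build
# and record a one-line PASS/FAIL result that could be reported upstream.

smoke_test() {
    tree="$1"
    log="${2:-smoke.log}"
    (
        cd "$tree" || { echo "FAIL: no such tree: $tree"; exit 1; }
        # "yes ''" accepts the default answer for any new config options,
        # so the build can run unattended.
        if yes '' | make oldconfig >/dev/null 2>&1 \
            && make bzImage >/dev/null 2>&1
        then
            echo "PASS: $tree $(date '+%Y-%m-%d')"
        else
            echo "FAIL: $tree $(date '+%Y-%m-%d')"
        fi
    ) | tee -a "$log"
}

# Example: smoke_test /usr/src/linux-2.4.5 /var/log/kernel-smoke.log
```

A cron job running this against each freshly downloaded -ac patch, then mailing or uploading the log line, is all the "make test" vision really requires.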
Of course, there are millions of lines of code, several different
architectures, thousands of supported and slightly-different pieces of
hardware, and hundreds of configuration options. Even worse, consider the
experimental options, system libraries, BIOS bugs, and distribution variations
that must be taken into account. Working that close to bare metal leaves
little room for a safety net. But the world needs idealists, and writing a
kernel is also hard work.
Getting involved
When the Quality Database itself goes live, it will provide a Web interface
to report and to read reports of failures and successes. Ideally, this will be
the first line for puzzled users and administrators. The more data a report contains,
the more likely it can pinpoint the source of a bug or a solution.
Recent Linux adopters may fear the imposing-until-you-do-it world of
kernel
recompilation, but that’s the easiest way to start testing. If a new kernel boots
on your system, it’s already passed the most important test. (The people who
package the kernel for their distributions often produce useful patches trying
to make things work for as many people as possible.)
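For readers who have never rebuilt a kernel, the 2.4-era sequence can be sketched as follows. The paths, version, and helper names here are illustrative assumptions, and the DRY_RUN guard just prints each step so the sequence can be inspected without touching a real system.

```shell
#!/bin/sh
# Sketch of a 2.4-era kernel rebuild (paths and version are illustrative).
# Set DRY_RUN=1 to print each step instead of executing it.

run() {
    if [ "${DRY_RUN:-0}" = 1 ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

build_kernel() {
    cd "${1:-/usr/src/linux}" || return 1
    run make oldconfig        # carry the previous .config forward
    run make dep              # build dependency information (2.4 needs this)
    run make bzImage          # compile the compressed kernel image
    run make modules          # compile loadable modules
    run make modules_install  # install modules under /lib/modules
    # Keep the old kernel around; install the new one under a test name.
    run cp arch/i386/boot/bzImage /boot/vmlinuz-test
}

# Example: DRY_RUN=1 build_kernel /usr/src/linux-2.4.5
```

If the resulting kernel boots, the most important test has already passed; anything odd in the boot messages is exactly the kind of report the database wants.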
For people with more know-how and motivation, Crawford’s article on validating
the kernel links to several available tests. There’s no Holy Grail of
comprehensive kernel suites yet, but the need and the opportunity are there.
Programmers interested in kernel hacking could easily get their feet wet.
Clearly, this is an idea whose time has come. The most compelling
part of Crawford’s vision is that it’s within reach of average users. It
doesn’t require a degree in computer science or hours of free time, just a few
reports here, and a few tests there. The more people who invest in
improving software quality, the greater the payoff.