David knocked down Goliath. Can Google knock down Windows?
Last year, the Internet search giant announced Chrome OS, a platform that hopes to challenge Windows. At first glance, the matchup looks like a marshmallow facing a charging elephant, and the numbers reinforce the image: Windows runs most of the world's computers, while Google's strength lies in search. Expecting Google to topple an OS giant can seem impossible.
But is it?
Google's marketing plan ties into its business model. The company earns most of its money through advertising, and as Chrome OS gains interest, Google will have more opportunities to place ads. An operating system designed by the world's largest search engine will generate publicity, which will increase revenue. Even if people never buy netbooks that run Chrome OS, the publicity and attention alone may be worth it to Google.
But the incumbent has more than twenty-five years of experience behind it. After decades of development and debugging, Windows runs most of the world's computers and handles most applications. Users need an incentive to seriously consider Chrome OS. So far, Google touts a more streamlined operating system, but streamlined doesn't necessarily mean faster or more versatile than Windows; even Windows at its worst lets users do more. Chrome OS has to fill a need that Windows can't.
On the other hand, if Chrome OS builds on existing code (it is based on the Linux kernel) rather than starting from scratch, then Google can focus on meeting consumers' needs by streamlining the OS and making it more versatile.
With the OS coming soon to netbooks, the first true test will come; if Chrome performs better than Windows in this small arena, it could pose a challenge to Windows in the long run. Still, with Windows boasting twenty-five years of experience and success, David will have a formidable Goliath to knock down.
Friday, April 23, 2010
Friday, April 16, 2010
Why use Linux?
Apple popularized the desktop computer. Windows rode the mass production of PCs. UNIX laid the groundwork for modern operating systems. But one particular operating system has gained attention of its own, thanks to the work of a student who spearheaded its development.
Linux is, strictly speaking, a kernel, created by Linus Torvalds; it was the missing piece that completed Richard Stallman's GNU Project, yielding a complete free-software operating system. Linux is free and open source: it costs nothing to download and use, and it allows users (namely those unaffiliated with Microsoft, Apple, and other major computer corporations) to contribute their own code to the project.[1]
Linux is a stable platform, offering greater protection from viruses and a reduced risk of system crashes. In theory, this makes it an ideal operating system for anyone (although, in practice, much major commercial software, such as Adobe's products, doesn't run on Linux). But the fact that Linux is open source has allowed programmers and major corporations to build on its stability. Google, Amazon.com, DreamWorks, and Industrial Light and Magic have used (and continue to use) Linux, and government agencies are adopting the OS for their servers and computers.[2]
Ever since I got a Mac, I've learned about Linux, UNIX, and X11 to the point where I want to use the OS. Anything that provides greater stability would be great, but for me to use it, it would have to fill a need or a want. Right now, it's curiosity only, but that'll change once I use it. Part of me wants to learn how to program in C and write code for it, too, but until I know how to do it, I don't think I'd be using Linux as much as I'd like.
But based on what I've heard, I'd love to explore Linux and see what it does. Computers have always fascinated me, and seeing something different from Mac and Windows fascinates me. And if I can learn how to program code in C, Linux will be an adventure that I'd have the will and the skill to explore. For me, it's an opportunity just waiting to be taken.
--
Works cited:
1. "The GNU Manifesto"
2. "Who Uses Linux?"
Sunday, April 11, 2010
When computers attack--the threat and the response
Every technology is designed to accomplish results, but tools designed for good can be turned to crime. The Internet is no exception. In January 2010, a cyberattack traced to China broke into several Gmail accounts; the attack's sophistication revealed that cybercrime has a more profound effect on the digital world than once believed. Groups of programmers and hackers have successfully broken into websites run by companies, banks, and even the U.S. government, and the type of code used in the Google attack is only one means of launching such attacks.
Ever since the days of WarGames (1983), the Internet has proven surprisingly vulnerable and open: a gateway to the hacking of computers and possibly to the destruction of the Internet itself. Today, extortion and fraud target ordinary consumers and political targets alike. As a result, computer programmers have taken a stand, and the battle against cybercrime on all fronts has come to light.
Cybercrime fighter Barrett Lyon explains that today's security measures aren't enough to fend off hundreds of thousands of computers programmed to do the same thing at once. Most of today's computers share common weaknesses, which make them easy for malicious software to identify. If enough computers hit a site at the same time, they can take it down. This is the basis of the distributed denial-of-service attack, in which hackers crash websites through massive traffic overload: all of the bots (the computers simultaneously under the hackers' control) overwhelm the website by logging in or visiting it at the same time. This is what lets denial-of-service attacks, extortion, and theft succeed.
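Lyon's point about scale can be illustrated with a toy model: a server with fixed capacity stays healthy until enough bots, each sending only a trickle of traffic, push the combined load past it. The capacities and bot counts below are illustrative assumptions of mine, not figures from the reporting.

```python
# Toy model of a volumetric denial-of-service attack (illustrative
# numbers only): each bot sends a modest trickle of requests, but
# thousands of bots together exceed what the server can handle.
SERVER_CAPACITY = 10_000   # requests/second the site can serve (assumed)
REQUESTS_PER_BOT = 5       # requests/second per compromised machine (assumed)

for bots in (500, 2_000, 5_000):
    load = bots * REQUESTS_PER_BOT
    dropped = max(0, load - SERVER_CAPACITY)
    status = "overwhelmed" if dropped else "healthy"
    print(f"{bots:>5} bots -> {load:>6} req/s: {status}, {dropped} dropped/s")
```

No single bot looks unusual on its own, which is exactly why, as Lyon notes, conventional security measures struggle against this kind of attack.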
But Lyon's work has helped companies and law enforcement combat cyberattacks. His expertise as a professional cybercrime fighter comes from self-taught computer skills and his own history of hacking: as a teenager, he hacked AOL and deleted its domain name, taking the site offline for three days and drawing the attention of the news media and the FBI. As he came to understand more of computers' weaknesses, he used his skills to divert and fight attacks.
At the start of his career, he spotted signatures in a series of intense attacks, which helped him trace their source, and he even went undercover in the cybermafia, posing as a Russian hacker to learn more. He gained the confidence of a Russian hacker who went by the nickname "exe" (short for "extremist," not the "executable" file format); when Lyon posted the nickname in large public chat rooms, the false domain name the hacker was hiding behind came to light. In the end, the domain's registration records revealed the hacker's curriculum vitae.
Joseph Menn, author of Fatal System Error, lists "exe" as one of his more memorable cybercriminals. Like Lyon, "exe" was a self-taught computer whiz at a young age. "exe," whose real name was Ivan, began writing code that acted like a virus, spreading from one bot to many others. Menn has documented the rise of serious cyberattacks across the world. Denial-of-service attacks have long existed, but in recent years they have targeted government and media organizations; in Estonia and the former Soviet republic of Georgia, such attacks have shut down government and media websites. In the United States, stolen military secrets are among the greatest fears. Attacks like these still happen because as many as one in seven computers could be a bot, and most people don't realize it or know how to prevent it.
Menn adds that the technology behind the threat isn't the only thing that keeps cybercriminals from being prosecuted. The struggle plays out not in the streets or the drug market but in the world theater. In Russia and China, hackers are an asset that governments and organized crime can use, especially if they can coordinate thousands of computers to launch major attacks on the United States. The Google attacks of January 2010 have been attributed to the Chinese government, but that isn't all: hackers have managed to steal military secrets and have the potential to break into power grids, all through a technology designed to accomplish good.
With terrorists taking an interest in computer hackers and nations trading stolen military secrets, is there hope? Menn and Lyon don't see much. Menn reminds us that hacking drains as much as a trillion dollars from the economy, especially in online commerce. Lyon sees growing paranoia in the security industry: an erosion of trust that reduces the effectiveness of fighting any kind of crime. Unlike weapons built for specific purposes, software can be turned to anything. Until stronger and more effective security measures exist, these attacks remind us that the greatest of technologies can enable the greatest of crimes.
Sunday, April 4, 2010
Blu-ray discs--an overhaul in progress
High-capacity data storage took a leap forward in 2004, when the first Blu-ray disc devices entered the Japanese market. Due to their high storage capacity and cost-effective manufacture, Blu-rays are predicted to make DVDs obsolete. But how do they work, and do they make good on their promise?
Blu-ray discs exceed the capabilities of CDs and DVDs in several ways. They store more data, making them the current standard for high-definition media storage. High-definition signals have greater bandwidth and require more storage to preserve quality; high-definition video takes up far more space than a single CD or DVD can hold. A single-layer Blu-ray disc, on the other hand, can store more than four hours of HD video, and a dual-layer disc can store twice as much without compromising quality.
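The "more than four hours" figure is easy to sanity-check. Assuming a single-layer disc holds 25 GB and HD video averages around 12 Mbit/s (both assumed round numbers on my part, not figures from this post), a quick back-of-envelope calculation agrees:

```python
# Back-of-envelope estimate of HD video capacity on a single-layer
# Blu-ray disc. Both figures are assumed round numbers.
DISC_BYTES = 25 * 10**9   # assumed single-layer capacity: 25 GB
BITRATE = 12 * 10**6      # assumed average HD bitrate: 12 Mbit/s

hours = DISC_BYTES * 8 / BITRATE / 3600
print(f"~{hours:.1f} hours of HD video")   # roughly 4.6 hours
```

Higher bitrates shorten that figure and lower ones stretch it, but the estimate lands comfortably above the four-hour mark.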
In optical media, the recording layer is manufactured with bumps and lands. When the laser hits a bump, the light reflects back differently than from a land, and the reader detects the transition. In Blu-ray discs, the bumps are smaller and closer together, and the tracks are more tightly spaced; the smaller the features, the more data can be stored. For comparison, DVDs have a track pitch (track separation distance) of 740 nm and a minimum bump size of 400 nm, while Blu-ray discs have a track pitch of 320 nm and a bump size of 150 nm. To read this data, a blue (actually violet) laser shines at a wavelength of 405 nm; red lasers, in comparison, shine at 780 nm for CDs or 650 nm for DVDs. The shorter wavelength, combined with a higher numerical aperture (a measure of how tightly the lens focuses the light), lets Blu-ray optical readers resolve the finer features on the disc.
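Those geometry figures also explain the capacity jump. Treating each bit cell as roughly track pitch times minimum bump length is a crude simplification of my own, but it shows the scale of the improvement:

```python
# Crude areal-density comparison from the track pitch and minimum
# bump size quoted above (all dimensions in nanometers).
dvd_cell = 740 * 400   # nm^2 per bit cell on a DVD
bd_cell = 320 * 150    # nm^2 per bit cell on a Blu-ray disc

ratio = dvd_cell / bd_cell
print(f"Blu-ray packs roughly {ratio:.1f}x more data per unit area")
# Sanity check: 4.7 GB (single-layer DVD) times ~6 is in the same
# ballpark as a single-layer Blu-ray's 25 GB.
```

The roughly sixfold density gain lines up well with the jump from a 4.7-GB DVD layer to a 25-GB Blu-ray layer.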
The construction of Blu-ray discs solves reading problems that CDs and DVDs still face. The recording layer sits closer to the optical reader, which reduces the chance of disc tilt and, consequently, the chance that light won't reflect straight back at the lens. The laser also doesn't have to shine through a thick layer of plastic to reach the recording layer. DVDs store the data beneath the plastic, which can lead to birefringence, the splitting of the laser light into two differently refracted beams that can't read the disc. Blu-ray discs instead store the data just beneath the surface, under a much thinner cover layer, preventing this phenomenon.
In addition, Blu-ray discs transfer data faster than DVDs, allowing large files to be written in less time. They also offer stronger copyright protection: the discs carry encryption designed to prevent illegal duplication. And they're built with practicality in mind, with a coating that reduces the likelihood and effects of scratches and fingerprints.
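To put the transfer-rate difference in perspective, here's a rough comparison using nominal 1x speeds. I'm assuming the standard nominal rates of 36 Mbit/s for Blu-ray and 11.08 Mbit/s for DVD; real drives read and write at multiples of 1x, so actual times are much shorter.

```python
# Time to write a 20 GB file at nominal 1x speeds (assumed standard
# rates; real drives run at multiples of these).
FILE_BITS = 20 * 10**9 * 8   # a 20 GB file, in bits

for name, mbps in [("DVD 1x", 11.08), ("Blu-ray 1x", 36.0)]:
    minutes = FILE_BITS / (mbps * 10**6) / 60
    print(f"{name:>10}: ~{minutes:.0f} minutes")
```

Even at these baseline speeds, the Blu-ray drive finishes in roughly a third of the time, which matters once files grow to HD-video sizes.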
Do Blu-ray discs make good on their promise to revolutionize video and data storage? Their specifications and design suggest they can. Blu-ray hasn't yet overhauled our current standards of optical media, but with movie titles being sold in the format and software such as Final Cut Studio offering (limited) Blu-ray support, the technology is getting closer. With the United States now using HD television as its broadcast standard, the overhaul is in sight; and with companies like Pioneer announcing an optical storage medium that uses a UV laser to read a 500-GB disc, it's just a matter of time.
To learn more about Blu-ray, HD television and video, and the standards that continue to vie for attention, check out the links and learn more about the history that establishes where the technology stands today.