Saturday, May 8, 2010

Could this be the next Conficker?

FROM: Beau Smith "silly_willy_walnut_head"

TO: Mom

SUBJECT: I'm not typing a subject in this thing

--

Hey, Mom,

Earlier this week you asked if there was any kind of virus out there that could be the next Conficker. I'm glad you did, because I found something pretty serious. It's not on the level of Conficker, but it's serious enough that Microsoft has given it a high alert level.

Every Tuesday, Microsoft sends out patches to update Windows, and one of them had been causing systems to crash. When Microsoft researched the crashes, it found that a rootkit called Alureon was actually causing the problem. Alureon takes control of a person's system; it's very hard to detect, and many people don't know they have it. It monitors a user's Internet traffic and looks for passwords, credit card numbers, and other personal information. What's even more alarming is that it doesn't show itself on a system until it causes a crash.

We're both probably wondering the same thing: why didn't Microsoft detect this before it sent out the patch? From what I've seen, the patch shouldn't cause problems unless the system has the virus. The patch is just a way of discovering the problem. Plus, a virus like Alureon is written to keep itself hidden. When it's installed on a system, it deletes its own installation files, which goes a long way toward covering its tracks.

It's pretty overwhelming to think about all this, but preventing it is pretty straightforward. I definitely recommend buying Norton AntiVirus or Symantec AntiVirus. You can set them to scan all the files on your computer, which makes sure nothing is hiding in the system; they're good at isolating and removing viruses (including ones like this). Also, take extra caution on the Internet: only visit sites and accept emails from people you trust, and install a pop-up blocker in Firefox. Norton has a feature I love: as you surf the Internet, it detects whether or not sites can be trusted, and if a site doesn't look safe, the program gives you a warning screen.

Also, be sure to turn on your firewall and have the system check for updates every day. These keep your computer current and more secure. Plus, since Alureon looks for password data, be sure to change your computer's password and your online passwords. You can make them longer, add in a '$' or '&' character, and scramble letters so that they don't form words or phrases.
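If you'd rather have the computer do the scrambling for you, here's a little Python sketch of the idea (the length and the symbol set are just examples I picked, not rules):

```python
import secrets
import string

def make_password(length=14):
    """Build a random password from letters, digits, and a few symbols."""
    alphabet = string.ascii_letters + string.digits + "$&!#%"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(make_password())  # different every time you run it
```

The `secrets` module is meant for security-sensitive randomness, which is why I'd use it here instead of the ordinary `random` module.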

Hope that helps! I added the sites I found; they'll show you a lot about what's going on.

Love,

Beau

--

1. "Win32/Alureon"
2. "MS10-015 Restart Issues"
3. "Backdoor.Tidserv|Symantec"

Sunday, May 2, 2010

Jim Gibbons and technology

As the polls and overall public opinion show, Jim Gibbons has become deeply unpopular as governor of Nevada. Yet much of his voting record shows a clear bent toward technology and progress. But does it do any good?

Both before and after he assumed the office of governor in 2007, technology has been at the center of some of his votes. In his first State of the State address, he announced that he would allocate $170 million to improve the traffic flow and safety of several of Nevada's highways. He supported the Broadcast Decency Enforcement Act of 2005, which set the guidelines for broadcasting illicit content and the penalties for doing so. A year earlier, he voted in favor of investigating commercial space travel.

As a member of the Congressional Internet Caucus, Gibbons devoted some of his recent work to promoting awareness of and solving problems related to the Web. In early 2001, he supported the enforcement of criminal laws aimed at reducing the number of spam messages. In 2002, he voted in favor of a bill that allowed telephone giants to add high-speed Internet to their marketing. The next year, he voted to ban credit card payments to online gambling companies.1

Recently, Gibbons called for updated technology in the fight against illegal immigration. On April 26 of this year, he called on Barack Obama to take a stand on the problems at the border; part of his solution included updated technologies such as facial recognition software.2 His stance is that since we have greater capabilities with which to protect ourselves, it makes sense to use them.3

In his most recent State of the State address, he elaborated on the substantial problems facing the state. With the economy and housing market high on the list, technology didn't rank near the top. In fact, his main mention of technology was a project to recycle waste and make landfills a thing of the past.4

Progress in Nevada is the same as in any state: it needs technology, but that's not all there is to it. Technology depends on money and on the ingenuity of the public, and Gibbons's measures show a bent toward using it for the public's benefit.

--
Works cited:

1. http://www.issues2000.org/governor/Jim_Gibbons_Technology.htm
2. http://gov.state.nv.us/PressReleases/2010/2010-04-26_Immigration.htm
3. http://www.gibbonsforcongress.com/category/news/
4. http://www.lasvegassun.com/news/2010/feb/08/full-text-gov-jim-gibbons-state-state-address/

Friday, April 23, 2010

I'm David, but you can call me Google

David knocked down Goliath. Can Google knock down Windows?

Last year, the Internet search giant announced that it would be releasing Chrome OS, a platform that hopes to challenge Windows. At first glance, the comparison is like a marshmallow trying to survive a charging elephant, and the logistics only reinforce the image: Windows runs most of the world's computers, while Google's strength lies in its search engine. Expecting Google to best one of the OS giants can seem impossible.

But is it?

Google's marketing plan ties into its business practices. It earns most of its money through advertising, and with Chrome OS gaining interest, Google will have more opportunities to place ads. An operating system designed by the world's largest search engine will generate publicity, which will increase revenue. Even if people never buy netbooks that run on Chrome, the advertising and attention alone are worth it to Google.

But Google's challenger has more than twenty-five years of experience behind it. After more than two decades of development and debugging, Windows runs most of the world's computers and handles most applications. Users need an incentive to seriously consider Chrome OS. So far, its main selling point is a more streamlined operating system, but streamlined doesn't necessarily mean faster or more versatile than Windows. Even Windows at its worst allows users to do more. Chrome OS has to fill a need that Windows can't.

On the other hand, since Chrome OS is built upon an existing foundation (it's based on the Linux kernel) rather than starting from scratch, Google can focus on meeting the needs of consumers by streamlining the OS and making it more versatile.

With the OS coming soon to netbooks, the first true test will come; if Chrome performs better than Windows in this small arena, it could pose a challenge to Windows in the long run. Still, with Windows boasting twenty-five years of experience and success, David will have a formidable Goliath to knock down.

Friday, April 16, 2010

Why use Linux?

Apple sparked the desktop computer revolution. Windows built upon the mass production of PCs. UNIX laid the groundwork for modern operating systems. But one particular operating system has gained attention of its own, thanks to the work of a student who spearheaded its development.

Linux is the name of a kernel and operating system created by Linus Torvalds. Linux became the kernel that completed Richard Stallman's GNU Project, yielding a fully free-software operating system. Linux is free and Open Source; it costs no money to download and use, and it allows users--namely those unaffiliated with Microsoft, Apple, and other major computer corporations--to contribute their own code to the project.1

Linux is a stable platform, offering greater protection from viruses and reducing the risk of system crashes. In theory, this makes it an ideal operating system for anyone (although, in practice, much major software, such as Adobe's products, doesn't run on Linux). But the fact that Linux is Open Source has allowed programmers and major corporations to build on the OS's stability. Google, Amazon.com, DreamWorks, and Industrial Light and Magic have used (and continue to use) Linux, and government agencies are adopting the OS into their servers and computers.2

Ever since I got a Mac, I've learned about Linux, UNIX, and X11 to the point where I want to use the OS. Anything that provides greater stability would be great, but for me to use it, it would have to fill a need or a want. Right now, it's curiosity only, but that'll change once I use it. Part of me wants to learn how to program in C and write code for it, too, but until I know how to do it, I don't think I'd be using Linux as much as I'd like.

But based on what I've heard, I'd love to explore Linux and see what it does. Computers have always fascinated me, and seeing something different from Mac and Windows intrigues me even more. And if I can learn how to program in C, Linux will be an adventure that I'd have the will and the skill to explore. For me, it's an opportunity just waiting to be taken.

--

Works cited:

1. "The GNU Manifesto"
2. "Who Uses Linux?"

Sunday, April 11, 2010

When computers attack--the threat and the response

Every technology is designed to accomplish results, but even those designed for good can be used for crime. The Internet is no exception. In January 2010, a cyberattack originating in China broke into several Gmail accounts; the sophistication of the attack revealed that cybercrime has a more profound effect on the digital world than once believed. Mobs of programmers and hackers have successfully broken into websites run by companies, banks, and even the U.S. government, and the type of code used in the Google attack is only one means of creating and launching attacks.

Ever since the days of WarGames (1983), the Internet has proven to be surprisingly vulnerable and open—a gateway to the hacking of computers and possibly to the destruction of the Internet itself. Today, extortion and fraud hit personal consumers and political targets alike. As a result, computer programmers have taken a stand, and the battle against cybercrime on all fronts has come to light.

Cybercrime fighter Barrett Lyon explains that today's security measures aren't enough to fend off hundreds of thousands of computers programmed to accomplish the same thing all at once. Most of today's computers share common weaknesses, which make them easy for attack software to identify. If enough computers attack a site at the same time, they can successfully take down a website. This is the basis of a distributed denial-of-service (DDoS) attack, in which hackers crash websites through massive traffic overload. All of the bots—the computers simultaneously under the hackers' control—overwhelm the website by logging in or visiting it at the same time. This is what lets denial-of-service attacks, extortion, and theft succeed.

But Lyon's work has helped companies and law enforcement combat cyberattacks. His expertise as a professional cybercrime fighter comes from his self-taught computer skills and his own history of hacking; as a teenager, he hacked into AOL and deleted the domain name, which took the site offline for three days and got the attention of the news and the FBI. As he came to understand more of the weaknesses of computers, he used his skills to divert and fight attacks.

At the start of his career, he saw signatures in a series of intense attacks; this helped him find the source of the attacks, and he even went undercover into the cybermafia as a Russian hacker to learn more. He gained the confidence of a Russian hacker, who went by the nickname "exe" (which stood for "extremist" instead of the file format "executable file"); Lyon posted the nickname on large public chat rooms, which revealed the false domain name the hacker was hiding behind. In the end, the domain name revealed the hacker's curriculum vitae in the registration records.

Joseph Menn, author of Fatal System Error, listed "exe" as one of his more memorable cybercriminals. "exe" was like Lyon—a self-taught computer whiz at a young age. "exe," whose real name was Ivan, began writing code that acted like a virus, spreading from one bot to many others. Menn has documented the rise of serious cyberattacks across the world. Denial-of-service attacks have long existed, but in recent years they have targeted government and media organizations. In Estonia and the former Soviet republic of Georgia, these attacks have been used to shut down government and media websites. In the United States, stolen military secrets are among the greatest scares. Attacks like these still happen because as many as one in seven computers could be a bot, and most people don't realize it or know how to prevent it.

Menn adds that the technology behind the threat isn't the only thing that keeps cybercriminals from being prosecuted. The struggle is not in the streets or in the drug market, but in the world theater. In Russia and China, hackers are an asset that the governments and their mafias can use, especially if they know how to coordinate thousands of computers to launch major attacks on the United States. The Google attacks in January 2010 have been linked directly to the Chinese government, but this isn't all. Hackers have managed to retrieve our military secrets and have the potential to break into our power grids, and all through a technology that was designed to accomplish good.

With terrorists gaining interest in computer hackers and nations trading military secrets and draining our economy, is there hope? Menn and Lyon don't see much of it. Menn reminds us that hacking leads to a trillion-dollar drain on the economy, especially in online commerce. Lyon sees an increasing paranoia in the security industry; it's an erosion of trust, one which decreases the effectiveness of fighting any kind of crime. Unlike weapons with specific purposes, software can be turned to whatever purpose its user wants. Until greater and more effective security measures are created, we're reminded that the greatest of technology can enable the greatest of crime.

Sunday, April 4, 2010

Blu-ray discs--an overhaul in progress

High-capacity data storage took a leap forward in 2004, when the first Blu-ray disc devices entered the Japanese market. Due to their high storage capacity and cost-effective manufacture, Blu-rays are predicted to make DVDs obsolete. But how do they work, and do they make good on their promise?

Blu-ray discs exceed the capabilities of CDs and DVDs in several ways. They store more data, making them the current standard for high-definition media storage. High-definition signals have greater bandwidth and require more storage to preserve quality; high-definition video takes up far more space than a single CD or DVD can hold. A single-layer Blu-ray disc, on the other hand, can store more than four hours of HD video, and a double-layer disc can store twice as much without compromising quality.
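A rough back-of-the-envelope check of that "four hours" claim: a single-layer Blu-ray holds about 25 GB and a single-layer DVD about 4.7 GB, and I'm assuming a plausible HD bitrate of around 13 Mbit/s for the sake of the arithmetic:

```python
# Rough figures: capacities in bytes, bitrate in bits per second
BLU_RAY_SINGLE = 25e9     # ~25 GB single-layer Blu-ray
DVD_SINGLE = 4.7e9        # ~4.7 GB single-layer DVD
HD_BITRATE = 13e6         # ~13 Mbit/s, an assumed HD video bitrate

def hours_of_video(capacity_bytes, bitrate_bps):
    """Hours of video that fit at a given constant bitrate."""
    return capacity_bytes * 8 / bitrate_bps / 3600

print(f"Blu-ray: {hours_of_video(BLU_RAY_SINGLE, HD_BITRATE):.1f} hours of HD video")
print(f"DVD:     {hours_of_video(DVD_SINGLE, HD_BITRATE):.1f} hours of HD video")
```

At that bitrate, a single-layer Blu-ray comes out to just over four hours, while a single DVD holds well under an hour of the same video.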

In optical media, the recording layer is manufactured with bumps and lands. When the laser shines on a bump, the light bounces back sooner than when it hits a land, and the reader detects the difference. In Blu-ray discs, the bumps are smaller and closer together, and the tracks are spaced at a smaller distance; the smaller the sizes and distances, the more data can be stored. In comparison, DVDs have a track pitch (or track separation distance) of 740 nm and a bump size of 400 nm, while Blu-rays have a track pitch of 320 nm and a bump size of 150 nm. To read this data, a blue (actually violet) laser shines at a wavelength of 405 nm; red lasers, in comparison, shine at 780 nm for CDs or 650 nm for DVDs. The shorter wavelength and tighter focus of Blu-ray optical readers are precise enough to read the data on the disc.
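Those pitch and bump figures let you estimate how much more densely a Blu-ray packs its bits than a DVD, if you take each bit as occupying roughly (track pitch × minimum bump length) of surface area:

```python
# Areal density scales roughly as 1 / (track_pitch * minimum_bump_length)
DVD_PITCH, DVD_BUMP = 740, 400   # nanometers
BD_PITCH, BD_BUMP = 320, 150     # nanometers

density_ratio = (DVD_PITCH * DVD_BUMP) / (BD_PITCH * BD_BUMP)
print(f"Blu-ray packs bits ~{density_ratio:.1f}x more densely than DVD")
```

That works out to a factor of about six, which is in the same ballpark as the jump from 4.7 GB per DVD layer to 25 GB per Blu-ray layer.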

The construction of Blu-rays solves reading problems that CDs and DVDs still face. The recording layer of a Blu-ray disc sits closer to the optical reader, which reduces the chance of disc tilt and, consequently, the chance that light won't be reflected straight back. Blu-rays are also constructed so that the laser doesn't have to shine through a thick plastic substrate to reach the recording layer. DVDs store the data beneath the plastic, which can lead to birefringence, the splitting of the laser light into two differently refracted beams that can't read the disc. Blu-ray discs store the data on top of the substrate instead, preventing this phenomenon.

In addition, Blu-ray discs transfer data faster than DVDs, allowing large files to be written in less time. The discs also offer greater copyright protection; they're encoded with encryption that keeps them from being duplicated illegally. And the discs are built with practicality in mind: they're designed with a coating that reduces the likelihood and effects of scratches and fingerprints.

Do Blu-ray discs make good on their promise to revolutionize video and data storage? Their specifications and design say yes. Blu-ray hasn't yet overhauled our current standards of optical media, but with movie titles being sold in the format and software such as Final Cut Studio offering (limited) Blu-ray support, the technology is getting closer. With our country now using HD television as its standard, the overhaul is in sight. And with companies like Pioneer announcing an optical storage medium that uses a UV laser to read a 500-GB disc, that overhaul is just a matter of time.

To learn more about Blu-ray, HD television and video, and the standards that continue to vie for attention, check out the links and learn more about the history that establishes where the technology stands today.

Sunday, March 28, 2010

BIOS setup and drive devices

I don't have a PC anymore (I've been a Mac user since 2006), but last year, a neighbor gave us a really old laptop, so I pulled the settings from it.

BIOS (version A04) settings:

HDD size: 20 GB (20005 MB)
Boot order: CD drive, then HDD, then 3.5" floppy drive.
System primary password for the HDD: Disabled



DMA:

DMA, or direct memory access, is a transfer mode for data that bypasses the CPU. Data is sent directly from the drive to the memory, freeing up processor power and capacity.

Low-level formatting:

Low-level formatting is a process which assigns tracks and sectors to disks. As opposed to high-level formatting, which creates partitions based on the information that's already there, low-level formatting provides that information. This is the type of formatting that takes place before drives are put onto the market.

SATA:

Serial ATA is today's standard for computer drives. SATA devices transmit data serially, one bit at a time across the cable. Because of its speed (up to 1.5 Gbit/sec, versus PATA's 133 MB/sec) and its method of transferring data, most drives use SATA.
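Comparing SATA's gigabits per second against PATA's megabytes per second is easier once the units match. First-generation SATA uses 8b/10b encoding, meaning ten bits travel on the wire for every byte of data, so the 1.5 Gbit/s line rate converts like this:

```python
# SATA 1.5 Gbit/s line rate with 8b/10b encoding:
# 10 line bits carry 1 usable data byte
sata_line_rate = 1.5e9                        # bits per second on the wire
sata_throughput = sata_line_rate / 10 / 1e6   # usable megabytes per second

pata_throughput = 133.0                       # MB/s, Ultra ATA/133

print(f"SATA 1.5 Gbit/s = {sata_throughput:.0f} MB/s vs PATA {pata_throughput:.0f} MB/s")
```

So first-generation SATA's usable 150 MB/s only modestly beats PATA's 133 MB/s; the bigger wins came from the serial design itself and from the faster SATA generations that followed.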

ATA:

ATA, or Advanced Technology Attachment, is the foundation for today's data transfer between drives. It's a series of standards that determines how drives communicate with the rest of the system. ATA standards have changed over time; the earliest editions are now obsolete, but some early editions are still being used.

IDE:

IDE, or Integrated Drive Electronics, was the foundation of ATA. It integrated the drive controller onto the drive itself, so a drive could connect to the motherboard through a simple cable instead of a dedicated controller card. The first IDE devices entered the market in 1986, and Enhanced IDE (EIDE) devices became what we now know as PATA.

Friday, March 12, 2010

History of computer memory

Memory is one of the few computer technologies with more than a century of history behind it. One of the earliest known forms of memory began with one of the earliest calculating machines: in the 19th century, Charles Babbage used punch cards to input data into his Analytical Engine. Because punch cards were read-only, the machine couldn't write data or read anything apart from what was punched into the cards.

Computers stored data this way until 1932, when Gustav Tauschek's drum memory began to replace punch cards as primary memory. Drum memory operated much as today's hard drives do--rotating along an axis to read and write data. Drum memory was slow to catch on, though: it didn't gain widespread use until after World War II.

As drum memory continued to replace punch cards, another form of memory arose to challenge them both. Beginning in 1947, magnetic core memory relied on magnetic polarity to store binary information: wires ran through a grid of ring-shaped cores and set each ring's polarity. The first successful core memory was completed and put to use in 1953, relegating punch cards and drum memory to secondary storage for the rest of their lives.

But it wasn't long before even this was rendered obsolete, as the 1960s saw the precursors of today's memory technologies. In 1966, Hewlett-Packard designed and sold a computer with memory printed onto integrated circuits. Two years later, IBM patented dynamic random access memory, the same type of memory used in today's computers. The year after that, Intel established itself in the market by introducing what was then the largest memory capacity on a chip--one kilobit.

This remained the industry standard through early 1975, when the Intel-based Altair personal computer entered the marketplace. Later that year, the computer's memory expanded to four kilobytes. Then, in 1984, the next industry standard set a precedent for the personal computers we know today: Steve Jobs and Steve Wozniak's Apple introduced the Macintosh, which used 128 KB of RAM and led the way toward the 1 MB memory chip.1

The early 1990s introduced DIMMs, or dual inline memory modules--the RAM that today's computers use. Their predecessor was the SIMM, or single inline memory module, which was the standard through the 1980s. SIMMs tied the pins on both sides of the module to the same contacts, while DIMMs give the pins on each side their own separate contacts, thereby increasing system performance. The technology received an even greater boost in 1997, when SDRAM allowed the RAM to keep time with the system clock. Since the new millennium, the RAM that most computers use is DDR ("double data rate") SDRAM, so named because it transfers data twice per clock cycle, accepting twice as much data per second as the SDRAM of 1997.2
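The "double data rate" jump is easy to see in numbers. A module's peak transfer rate is its clock speed times the transfers per clock cycle times the width of the bus in bytes; with a standard 64-bit module, the same 133 MHz clock doubles its throughput when it transfers on both edges of the clock:

```python
# Peak transfer rate of a memory module, in MB/s:
# clock (MHz) * transfers per clock cycle * bus width in bytes
def peak_mb_per_sec(clock_mhz, transfers_per_cycle, bus_width_bits=64):
    return clock_mhz * transfers_per_cycle * (bus_width_bits // 8)

print(peak_mb_per_sec(133, 1))  # PC133 SDRAM: 1064 MB/s
print(peak_mb_per_sec(133, 2))  # DDR-266: 2128 MB/s -- same clock, twice the data
```

The figures match the modules' marketing names: PC133 SDRAM moves about 1.06 GB/s, and DDR-266 (sold as PC2100) moves about 2.1 GB/s.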

Today, this technology continues to grow. Many personal computers come with 4 to 6 GB of memory installed, and others, such as Apple's 27" iMac, can support as much as 16 GB. As amazing as these standards are, we'll someday look back on them as inferior in comparison. But no matter how obsolete each year's technology becomes, it's a reminder of science's never-ending goal of making each technology better than before. With nearly two centuries of invention and innovation standing behind it, computer memory makes this goal a case in point.

-----

Works cited:

1. History of Computer Memory.
2. Andrews, Jean. CompTIA A+ Guide to Managing and Maintaining Your PC. 7th ed. Boston: Course Technology, 2010.

Sunday, March 7, 2010

Assistive technologies

In today's world, computers have worked wonders for people with disabilities. Assistive Technologies, a company that develops and distributes technology in this field, is led by Don Dalton. Dalton, who is a quadriplegic, runs his company by talking into a headset. Many of the products he sells share one goal--to make computer use easy by turning modern technology into an advantage.

One of the products his website distributes is Dragon NaturallySpeaking, a speech-interpreting program that types documents and performs computer commands just by talking into a headset. The software can replace a keyboard and mouse altogether, so it works as a timesaver for anyone and a lifesaver to those with disabilities. The preferred edition of the software costs $199 through Assistive Technologies, and it runs only on Windows.

Another product is JAWS, screen-reading software for the visually impaired. The program reads the contents of the screen aloud through a speech synthesizer and can also send it to a Braille display. The website sells the product for just under $1,100.

Friday, February 26, 2010

Computer processor terms

FROM: Beau Smith

TO: Mom

SUBJECT: I'm not typing a subject in this thing



Hey, Mom,

Sorry it's taken so long to write back, but I'm ready to help you out with your computer. Processors, or CPUs, are pretty easy to understand--they're basically like the heart of your computer because they take in and send out data across the entire system. I like comparing it to that because it'll help some of the terms make sense.

Processor frequency is like the heart rate of the computer. It tells you how many processing cycles the processor completes every second. The higher the frequency, the faster your computer will run. It's impressive enough that a computer can complete billions of those cycles every second.

When we talk about word size, we're talking about how many bits of information the processor can handle at once. The larger the word size, the more data the processor can take in and work on in a single step.
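One concrete consequence of word size: if the processor also uses word-sized memory addresses, the word size caps how much memory it can point at. Here's a little Python sketch of the arithmetic (you don't need to run it, Mom, it's just the math):

```python
# A processor using n-bit memory addresses can point at 2**n distinct bytes
def max_addressable_gib(address_bits):
    return 2**address_bits / 2**30   # bytes -> GiB

print(f"32-bit addresses: {max_addressable_gib(32):.0f} GiB")   # 4 GiB
print(f"64-bit addresses: {max_addressable_gib(64):.0f} GiB")
```

That 4 GiB figure is why 32-bit systems famously can't use much more than 4 GB of RAM, while 64-bit systems have room to spare for a very long time.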

Overclocking is basically running the risk of a computer going into cardiac arrest. Remember how you said that if I worked out too hard on the treadmill--if I got my heart rate above a safe limit--that I could possibly have a heart attack? It's the same thing; if a computer is forced to operate at a higher frequency than it was designed for, the processor overheats and can be damaged. People actually do overclock their systems, but many of them are gamers and hobbyists. For your work, overclocking isn't necessary (much less recommended).

Datapath is a tricky one to define, so stick with me for a moment. :) A processor is divided into several units. Some control the instructions that go in and out of the processor, and others actually calculate and process the instructions. A datapath is basically the route data takes when it's being calculated and processed. A datapath is like the part of the brain that thinks about something and understands it (as opposed to acting on that particular something).

SRAM, or static RAM, is memory in the processor. There's a difference between this and the type of RAM you can add to a system. The RAM you add to a system is typically just called memory, but it's also called DRAM (dynamic RAM); this kind of RAM has to be constantly refreshed to hold onto its data. But SRAM, which stays in the processor, holds its data without refreshing as long as the computer has power; this is important because it keeps the data the processor will likely need down the line. Put another way, DRAM is like saying, "I'll take instructions as they come," but SRAM is like saying, "I'd better keep this in mind because we might need it later."

Memory caches are temporary data storage. Any kind of RAM is a form of temporary data storage, but memory caches are designed specifically for processors. The kind built into the processor itself is the primary cache; because it's closest to the action, it makes the system run as fast as possible. It's like the human heart, giving the body the most life. External caches sit near the processor but aren't built into the processor core itself. They still help, like a deep breath or a glass of water helps the heart and the rest of the body, even though they're one step removed.
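If the heart analogy feels abstract, here's the caching idea in a few lines of Python. This is a toy, not how hardware actually works (real caches operate on memory addresses, not dictionary keys), but it shows why keeping recently used data close at hand saves time:

```python
# A toy illustration of caching: keep recently used results in small,
# fast storage so the slow work only has to happen once per item.
class ToyCache:
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.data = {}                # our small, fast storage
        self.hits = self.misses = 0

    def get(self, key, slow_lookup):
        if key in self.data:
            self.hits += 1            # found it in the cache: fast
            return self.data[key]
        self.misses += 1              # not cached: pay the slow cost once
        value = slow_lookup(key)
        if len(self.data) >= self.capacity:
            self.data.pop(next(iter(self.data)))  # evict the oldest entry
        self.data[key] = value
        return value

cache = ToyCache()
for n in [1, 2, 1, 1, 3, 2]:
    cache.get(n, lambda k: k * k)    # pretend k * k is expensive to compute
print(cache.hits, cache.misses)      # 3 hits, 3 misses
```

Half the requests never had to do the slow work at all, and that's the whole point of a cache, whether it's a dictionary in Python or SRAM next to the processor core.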

*Exhales* Wow, I know we've gone over a lot, but I hope this helps! If you have any more questions, just let me know!

Love,

Beau

Sunday, February 21, 2010

Powerstax PLC

FROM: Beau Smith <"silly_willy_walnut_head">

TO: Mom

SUBJECT: Why do I have to type a subject in this thing?



Hey, Mom,

I got your e-mail. You said your office was looking to upgrade its power system. I found something that'll help you guys figure out what you need. A company called Powerstax PLC in the UK sells bulk power supplies, and one of their distributors is Peak-to-Peak Power near Tampa. The module right here has the best wattage and highest efficiency; in other words, a few of these will power every computer in the office without costing an arm and a leg. They're about $640 apiece for five units.

If you're also looking at other products, have your IT guys check out the Powerstax website. The company sells AC-DC power converters and DC-DC converters (if numbers matter, there are 27 of those alone). According to the website I found the company through, they're a well-rated company to buy from. That last website I showed you can give you a lot of information about the power supply you've got as long as you type in a brand name, but I don't recommend going through the Powerstax website for info unless you're using their products.

Hope that helps with your power situation in the office! I'll bet the thunderstorms in Tampa are exciting. I wish I could be there to watch.

Love you,

Beau

Friday, February 12, 2010

Trading Competence for Cash

Can the outsourcing of customer support ever be justified? Most computer users don't think so. International customer support for computer users continues to leave customers stranded and frustrated. Even the biggest companies in the world would do well to know the reasons why.

Even as technology spreads across the world, it doesn't solve every problem, and communication is one of them. Support staff without a strong grasp of the English language present an obstacle--not an excuse, but an obstacle. When a company and its customers can't communicate well, the complaints are justified.

A qualified technician needs good communication and solid knowledge of computers. This holds true for any technician, whether local or overseas. This forum thread illustrates several users' experiences with HP, its products, and its local and overseas technicians.

Make no mistake: the issue is competency. The goal of any good company is to provide the best products and the best services, and its employees need the skills to make it happen. The demand for good computers and good customer service underscores this fact. Every penny spent toward running a good business earns money down the line; a loyal customer base will come back for more products and tell others that the company does quality work.

But trading competence for cash produces the opposite effect. It reduces a long-term customer base and downgrades superior products. In essence, it creates a paradox: the company loses a dollar to save a dime. Even in today's economy, outsourcing overseas tends to promote a loss of quality control, and the complaints that result from overseas tech support prove it.

One of the basic unspoken principles of economics will never change: Everyone loves a good product. If I ran my own computer company, my goal would be simple: satisfaction of a job well done. Everything I do would work toward that goal, and I would want to hire competent workers who share that goal. This justifies the next unspoken rule of economics: Good customer service maintains a business's integrity.

Even though these rules are unspoken and unofficial, everything I've seen with customer service has taught me that any good business relies on both these rules. There is no substitute for competent customer service. Anything else is less.

Wednesday, February 3, 2010

Computer ad

These questions are taken from page 33 of CompTIA A+ Guide to Managing and Maintaining Your PC, seventh edition. The ad is on page 34.


1. What is the system bus called? What is the system bus frequency?
The system bus is the front-side bus on the ASUS P6T Deluxe motherboard, and its frequency is 1066 MHz.

2. What is the frequency for the processor?
The processor's frequency is 2.66 GHz.

3. What is the brand of the processor?
The processor's brand is Intel.

4. How much RAM is installed?
6 GB of RAM is installed.

5. What type of expansion slot is used for the video card?
A PCIe slot is used for the video card.

6. What type of interface does the hard drive use?
The hard drive uses a SATA Revision 2.0 interface.

7. How much data can the hard drive store?
The hard drive can theoretically store 1 TB of data (the actual formatted capacity is less).

8. What is the brand of the motherboard?
The motherboard is made by ASUS.

9. What type of optical drive is used?
An LG optical drive is used.
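A footnote on question 7: part of why the formatted capacity comes out lower is a units mismatch. Drive makers count 1 TB as 10^12 bytes, while operating systems report sizes in binary units (1 GiB = 2^30 bytes), so the same drive shows up smaller on screen:

```python
advertised = 10**12                  # "1 TB" as the drive maker counts it
reported_gib = advertised / 2**30    # the same bytes in binary GiB
print(f"1 TB = {reported_gib:.0f} GiB")   # about 931 GiB
```

File-system overhead shaves off a bit more on top of that, but the decimal-versus-binary difference alone accounts for roughly 7% of the "missing" space.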

Sunday, January 31, 2010

Seven things you don't know about me (until now...)

How's it going?

Space for Rent here. I'm a computer science major at TMCC and an outspoken geek. Here's a short list of things that have gone unknown until now.

1. I've been around computers since 1994, and I've been a Mac user since 2006.

2. I'm an author and novelist. I'm hoping to publish my first book sometime this year.

3. I've been photographing thunderstorms in Reno since 2004. (I still have yet to go to Arizona, Florida, or the Midwest.)

4. I was diagnosed with cancer in 2002. Ever since the surgery that same year, the cancer hasn't returned.

5. I've been a musician and composer for most of my life.

6. Some of what I've learned about computers has come through the show "24." (I've confirmed that a lot of the computer jargon they use is based in reality, actually!)

7. My favorite OS is Windows XP.



Until next time,

This space is still for rent!