11 New Models to Enliven the Indonesian Car Market This Year!

Bangkok, KompasOtomotif - The cars on display at the Bangkok International Motor Show (BIMS) serve as an unofficial preview of the new models that will be marketed in Indonesia. Understandably so: as the largest automobile production base in ASEAN, Thailand remains the region's mecca. At BIMS, which is still running (26 March-6 April 2012), 11 of the roughly 650 vehicles on display are new models that will be marketed in Indonesia.

Honda, for example, is displaying two new models, the Brio and the all-new Civic, both to be launched in the second half of this year. PT Honda Prospect Motor (HPM) says the Brio will initially be imported, starting in the last quarter of this year, followed by local assembly from 2013. The Civic gets a fresh look, with a design completely changed from the previous model: the headlights are now more slanted, the combination lamps are trapezoidal rather than round, and the body looks wider. It is offered with two petrol i-VTEC engine options, a 1.8-liter producing 140 PS and a 2.0-liter producing 154 PS.

Toyota has confirmed that it will launch the facelifted Camry, including a hybrid variant, in Indonesia next week (April 2012). The Camry hybrid will join the Prius in Indonesia's market for environmentally friendly vehicles. Toyota hopes that by targeting the upper segment, its "green" technology can be more readily accepted. Japan's largest carmaker also sells the Toyota 86 in Thailand at 2.49 to 2.74 million baht. The new sports coupe will be Toyota's first entry into Indonesia's sports car market and is planned for launch in the second quarter of this year.

Suzuki and Mitsubishi are debuting two new compact cars, the Swift and the Mirage. Both are products of the "Eco Car" program launched by the Government of Thailand. PT Krama Yudha Tiga Berlian Motors (KTB) has confirmed that it will bring in the Mirage this year.
Suzuki, meanwhile, is still considering it. "We only sell the Swift imported from Thailand, but with the 1.5-liter engine instead of the 1.2-liter," said Davy Tuilan, Marketing Director of PT Suzuki Indomobil Sales, when contacted by KompasOtomotif on Friday (30/03/2012).

Beyond the Japanese dominance, two American brands, Chevrolet and Ford, will also introduce new models to be marketed in Indonesia this year. PT Ford Motor Indonesia (FMI) has confirmed the launch of two new models, the All-New Ranger and the All-New Focus, plus possibly the 1.5-liter Fiesta. "The All-New Focus will come after production starts in Thailand, in August (2012). As for the Fiesta (1.5 liters), we are indeed interested, but nothing is certain yet," said Bagus Susanto, Managing Director of FMI.

Chevrolet is debuting two new models: the all-new Trailblazer in the medium SUV segment, plus the Sonic to fill the small-car (B) segment, including a hatchback model. Interestingly, the Sonic is positioned as a replacement for the Aveo; besides the hatchback, a sedan version will also be launched in Indonesia in the second half of this year. The market is getting more crowded, with plenty of choices!

A Beginner’s Guide to Overclocking Your Intel Processor

If you want to squeeze every last ounce of processing power out of your new computer or aging system, overclocking is a great, if slightly nerve-racking, option. Here are some simple guidelines for safely overclocking your CPU.

Simply put, overclocking your CPU means running your processor at a faster speed than it was rated for out of the box. While overclocking, at its core (no pun intended), can be quite simple, there's a bit more to it than tweaking one setting. The main setting that determines your CPU speed (known as your base clock) also affects your RAM speed, so some balancing is required to get the two right. You'll also have to raise a few voltage levels, because without enough power the CPU can't run fast enough. Higher voltage, however, means higher temperatures, so you need to be careful not to overheat your CPU, which can shorten its life or even fry it completely if you're not careful.

This particular guide was written with the Intel i7-930 "Bloomfield" processor on a Gigabyte GA-X58A-UD3R motherboard, though most of the basic ideas behind it apply to other, similar processors (such as the Lynnfield i5s and i7s) and other compatible motherboards. I highly recommend doing extra reading, even if you have the same gear as me, and especially if you have different gear. No two systems are the same, and overclocking is not something to be taken lightly. I wrote this guide from my own experience, but I had a lot of help from sites like overclockers.com, overclock.net, and hexus.net, to name a few. At the end of this guide I've provided links to a few specific guides with more information, and I highly suggest you check them out. Also, do some forum browsing and googling on subjects you don't understand or that we don't cover in depth.
All that said, this guide should put the process into basic enough terms that those unfamiliar with overclocking can get a general feel for what it entails and how to get started overclocking their systems.

Why Overclock?

There are any number of reasons to overclock, but in general, you'll get the most benefit if you use your computer for CPU-intensive tasks such as gaming or video encoding. Of course, many overclockers simply enjoy the thrill of playing around with their systems and pushing them to the limit (or at least pushing them further), so if that's reason enough for you, then go for it. Know, however, that overclocking can be dangerous, and can shorten the life of, or permanently damage, some of your components if something goes wrong, so don't start tweaking unless you're prepared to face the potential consequences. Note: I'm being particularly doomsday here to emphasize that things could go wrong. Done correctly, overclocking is generally a pretty safe endeavor (I've never damaged my gear), but if you're not willing to risk damaging your processor, you may want to skip it.

Glossary

While this is not an exhaustive list, these are the settings we'll be tweaking in this guide. The jargon related to your clock speed is as follows:

The base clock affects both your CPU frequency and your RAM frequency (among other things), as described below. This is one of the main settings we'll be focusing on in this tutorial.

Coupled with the base clock, your CPU multiplier determines your final CPU frequency. It works like this: if your base clock is, say, 133 MHz (the default on the i7) and your multiplier is x21, your CPU frequency will be 133 MHz x 21 = 2.8 GHz.

The RAM multiplier is very similar. The same base clock of 133 MHz coupled with a RAM multiplier of 8 will give you about 1066 MHz.

The uncore frequency is another multiple of the base clock, and basically governs the speed of everything that isn't the core.
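To make that arithmetic concrete, here's a quick sketch in Python. The numbers are the i7-930 examples from above; your board's defaults may differ.

```python
# Sketch of the clock relationships described above. The constants are
# the article's i7-930 examples, not universal values.

def cpu_freq_mhz(base_clock_mhz, cpu_multiplier):
    """CPU frequency = base clock x CPU multiplier."""
    return base_clock_mhz * cpu_multiplier

def ram_freq_mhz(base_clock_mhz, ram_multiplier):
    """RAM frequency = base clock x RAM multiplier."""
    return base_clock_mhz * ram_multiplier

def uncore_multiplier(ram_multiplier):
    """Keep the uncore at a 2:1 ratio with the RAM multiplier."""
    return ram_multiplier * 2

base = 133  # i7-930 default base clock, in MHz
print(cpu_freq_mhz(base, 21))   # 2793, i.e. roughly 2.8 GHz
print(ram_freq_mhz(base, 8))    # 1064, i.e. roughly 1066 MHz
print(uncore_multiplier(8))     # 16
```

The same three relationships come up again and again below, so it's worth internalizing them before touching the BIOS.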
Your only goal with the uncore frequency is to keep it at a 2:1 ratio with your RAM multiplier, so any time you edit your RAM multiplier, double-check your uncore frequency to keep your system stable.

As far as voltages go, these are the ones you'll want to be familiar with:

CPU Vcore is directly related to your CPU frequency. You will likely have to raise it as you raise your CPU multiplier.

QPI/Vtt voltage helps keep your system stable as you raise the base clock.

DRAM (or VDIMM) voltage is the voltage supplied to your RAM. For this guide you probably won't have to raise it, but if you do, make sure it doesn't go more than 0.5 volts above your QPI/Vtt voltage.

IOH voltage is the voltage supplied to your PCI Express cards. You'll probably set it once and leave it there, depending on how many graphics cards you have.

What You'll Need

A Windows machine. This guide is for Windows, though if you have a Hackintosh or a Linux machine with a Windows partition, that will do fine; we're just going to do our stability testing in Windows.

An Intel i5 or i7 processor. Again, this guide was written with the i7-930 Bloomfield, though Lynnfield i5s and i7s follow similar processes. The Clarkdale i3s and i5s are a bit more complicated, since the GPU is also thrown into the mix, so I suggest you do some extra reading on guides made specifically for those processors (this guide, for example, has some helpful notes). And again, do some outside reading even if you have a Lynnfield processor, as each model has its own idiosyncrasies.

Prime95. This program was made to calculate prime numbers and has become the standard for stress-testing your CPU. It will consistently load your CPU to 100%, helping you decide whether it's stable and cool enough to run regularly. Update: Lots of readers have also recommended a program called LinX over Prime95.
I find that LinX sometimes passes tests that Prime95 fails, so I prefer Prime95, but LinX is a good way to speed up the first few rounds of stress testing, since it doesn't take nearly as long.

RealTemp. This program will help you monitor your CPU temperature as you run Prime95, so you know if your CPU is getting too hot.

Memtest86+. This is a small boot CD that we'll run just to make sure your memory is stable once you raise the base clock.

A good cooling system. If you plan on overclocking more than just a little, you'll want something other than the stock Intel heatsink and fan. This is the heatsink and fan I use, and it has served me quite well. Of course, if you have the cash, you can spring for water cooling and get even lower temperatures (and thus higher overclocks). Ask around, read reviews on Newegg, and think about your overclocking goals to decide what kind of cooling you want.

The Process

Here I will outline the basic steps to getting a stable overclock on your system. This process is longer than what most guides suggest, but I've found that it's much easier to get everything stable if you change only one thing at a time, rather than setting your clock speed and guessing which voltages you need to raise. It is pretty time-consuming, so find something to do while you test, because you'll be doing a lot of restarting and waiting around.

Editing the BIOS is fairly easy. To change a setting, highlight it with the arrow keys and press Enter to see your BIOS' predefined values for that setting. Alternatively, you can highlight a setting and start typing a number, and your BIOS will usually show you the predefined values in a small window to the side. Some settings cannot be edited by default, and you may need to look at the option above them to enable tweaking of that feature.
Note also that some settings live in submenus, usually indicated by a "press enter" prompt instead of a value next to them. We're going to spend most of our time in what my BIOS calls the "MB Intelligent Tweaker", though your motherboard may call it something different. Your screens may not look exactly like mine, but they should give you a general idea of what is what.

Set a Goal and Prepare Your BIOS

Since we can't just jump right up to our desired speed, it's important to have a goal in mind before we begin. Think about why you're overclocking and what final speeds you'd like to see out of your computer: whether it's just a little (say, bringing a 2.8 GHz i7 up to 3.5 GHz) or something pretty high (say, 4 or 4.2 GHz). Keep in mind that these numbers are relative; if you've got an i5 or an i3, your goals will be a bit lower. Now, with that goal in mind, work out what base clock you'd need to reach it with a multiplier of x19 or x21, since your multiplier will generally be in that area and x20 tends to be unstable. My goal, for example, was to make it to 4 GHz, which is lofty but doable: I could go with 190 MHz at a multiplier of 21, or 210 MHz at a multiplier of 19, to reach 3.99 GHz. I opted for the former, but made 200 MHz my base clock goal just so I'd have a bit of wiggle room at the end.

If you've messed with your BIOS at all, reload the default settings before beginning. If there are any settings you need enabled (like legacy support for USB keyboards), set those now. There are also a few settings we'll want to disable before we start overclocking.
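Incidentally, the goal-setting arithmetic above can be brute-forced. This little sketch (the multiplier list and tolerance are assumptions for illustration, not BIOS values) reproduces the 210 x 19 and 190 x 21 options for a 3.99 GHz target:

```python
# Hypothetical helper: find base clock / multiplier pairs that land
# near a target CPU frequency.

def candidate_settings(target_ghz, multipliers=(19, 21), tolerance_mhz=50):
    target = target_ghz * 1000  # work in MHz
    results = []
    for mult in multipliers:
        # Base clocks are set in whole MHz in the BIOS.
        base = round(target / mult)
        freq = base * mult
        if abs(freq - target) <= tolerance_mhz:
            results.append((base, mult, freq))
    return results

for base, mult, freq in candidate_settings(3.99):
    print(f"{base} MHz x {mult} = {freq / 1000:.2f} GHz")
```

Running it prints "210 MHz x 19 = 3.99 GHz" and "190 MHz x 21 = 3.99 GHz", the same two routes to 4 GHz discussed above.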
First, disable anything related to Turbo Mode, as it will give you a higher clock speed than you define, and we want to know exactly where our processor is, speed-wise. Also turn off any power-saving settings such as EIST, C1E, and other C-state support for now, and turn off Load Line Calibration. Both the power-saving settings and Load Line Calibration are a bit controversial: many people say you can turn them back on after you're done overclocking, while others prefer not to. Do some extra reading on the debate and decide for yourself. For what it's worth, I've got them all re-enabled right now and have yet to have issues, but my overclock is still young, so that may change.

Lastly, turn to your RAM. We will not be overclocking RAM in this guide, but we still need to tinker with it a bit. Often your RAM doesn't even run at stock speed by default, so we'll make sure it gets up to speed. Before starting, set your RAM timings according to your manufacturer's specifications. To find them, look up your RAM on Newegg (for example, this is mine) and click the Specifications tab. Under "Timing" you'll see a string of numbers, such as 9-9-9-24-2N. These refer to the CAS Latency, tRCD, tRP, tRAS, and Command Rate, respectively. Go into your BIOS and enter your manufacturer's numbers for these settings (you may need to do it separately for each channel). Also set your DRAM voltage to the value defined by your manufacturer, in this case 1.5 volts. If everything looks good, it's time to start adjusting the base clock.

Isolate and Stabilize Your Base Clock

The first step is to make sure your base clock can reach your desired goal. The best way to do this, I've found, is to turn everything else down so we can focus on one thing at a time.
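A quick aside on the timing string from the RAM step above: it maps to named fields in the order the guide gives, which a couple of lines of Python make explicit.

```python
# Sketch: split a manufacturer timing string such as "9-9-9-24-2N"
# into the named fields listed above (order as given in the guide).

def parse_timings(timing_string):
    names = ("CAS Latency", "tRCD", "tRP", "tRAS", "Command Rate")
    return dict(zip(names, timing_string.split("-")))

print(parse_timings("9-9-9-24-2N"))
# {'CAS Latency': '9', 'tRCD': '9', 'tRP': '9', 'tRAS': '24', 'Command Rate': '2N'}
```

When you enter these in the BIOS, each field usually has its own labeled setting, so the mapping above tells you which number goes where.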
Turn your CPU multiplier down to something low, like x12, and put your RAM multiplier on its lowest setting, usually 6 (sometimes displayed as 2:6). Since we've edited the RAM multiplier, we also have to change the uncore frequency, so set it to x12 (double your RAM multiplier of 6). This will keep your system stable. Now go back to the tweaking screen and down to the voltage section. Set your IOH core to 1.3 if you have one PCI Express card, or 1.35 if you have two (some motherboards don't have this setting, in which case skip this step). Set your Vcore and QPI/Vtt voltages to their normal values (which should be listed next to them); just be sure to take them off Auto. Everything else can be left as-is for now. Lastly, increase your base clock from its normal setting by 10 or 20 MHz. Then save your BIOS settings, exit, and restart.

Boot into Windows and start up RealTemp. At this point, lots of people also like to run CPU-Z to make sure their settings were applied correctly, though I find that it just gets in the way; it's up to you. Next, start Prime95 (or your stress-testing program of choice) and select "Just Stress Testing" if prompted. If the Torture Test window doesn't come up automatically, go to Options > Torture Test and set it to run a Blend test. Hit OK and let it run for about five minutes. Your temperatures probably won't be too high at this stage, but keep an eye on them anyway. The hottest temperature you're willing to reach while running Prime95 is up to you, though I like to keep it under 85 degrees Celsius or so (80 if I can help it). After five minutes, if Prime95 is still running without errors, restart back into the BIOS, bump your base clock up by another 10 MHz, and run through the process again.
If Prime95 threw an error, or if your computer froze, restarted, or gave you the BSOD (Blue Screen of Death), either during Prime95 or before Windows even fully booted, go back into your BIOS, raise the QPI/Vtt voltage by one increment, and run the test again. Repeat this process until you reach your base clock goal, or until you reach unsafe temperatures (which, again, is unlikely at this stage). Once you're running at your desired base clock and it's handling a few minutes of Prime95 without error, run the test for an hour or so instead of five minutes, raising the voltage if need be. Once it's stable for an hour of Prime95, move on to the next step.

Stabilize Your Memory

Now that your base clock is where you want it, you'll want to get your RAM running at or near stock speed. Your stock speed is listed on the Newegg page for your RAM, generally in the title after its DDR type. My RAM's speed, for example, is 1600 MHz (listed as "DDR3 1600"). At this point in my testing my base clock was 200 MHz, so setting my RAM multiplier to 8 gave it the stock speed of 1600 (200 base clock x 8 = 1600). You may not be able to hit it exactly with your chosen base clock, so just get as close as you can (remember, we're not overclocking RAM today, so the goal is just to get it close). Don't forget to reset your uncore frequency as well, to keep it at twice your RAM multiplier.

Once everything is set, reboot your computer with the Memtest86+ disc inserted (make sure your BIOS is set to boot from CD before the hard drive). As it starts up, choose option 1 and it will automatically begin testing. It should make it through one cycle without a problem, since we're not overclocking the RAM. If it throws an error, try raising the DRAM voltage (but be careful not to raise it more than 0.5 volts above your QPI/Vtt value).
If it still isn't stable, you might be one of the unlucky few who can't run their RAM at stock speed, so go back and lower the RAM multiplier (and uncore frequency) again to see if that helps. If you're still having problems after a bit of tweaking, you may have a defective stick of RAM that went unnoticed until now. Once your RAM passes Memtest, reboot into your BIOS for the last step.

Adjust Your CPU Multiplier

Now it's time to get your CPU running at your desired frequency. Everything else at this point should be stable, so you just need to adjust the CPU multiplier and the Vcore voltage. Leave the Vcore where it is (it should still be at the "normal" value defined by your board), and raise your multiplier by a few levels. Run the Prime95 test again as described previously, though keep a closer eye on your temperatures, as they may start to get pretty high in this phase. If everything checks out after a few minutes, restart into your BIOS and raise your multiplier again. If the test fails, reboot into your BIOS, raise your Vcore by one increment, and run the test again. If your temperatures get too high (into the mid-to-high 80s), you either need a better cooling system or will have to settle for a lower overclock.

If you reach your desired clock, you're in the home stretch. You may have to do a bit of fiddling at this point to get to the exact clock you want (200 MHz was my goal base clock, but I'm actually running at 190 right now to get a 190 x 21 = 3.99 GHz clock speed). Once everything looks good, put it through more rigorous testing: run Prime95 for anywhere from 6 to 12 hours and see if it passes. If not, raise the Vcore a bit more and try again. Once you can run Prime95 for 6 hours or more without an error, you've got yourself a pretty stable overclock.
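The multiplier-and-Vcore loop described above is really a simple algorithm: raise the multiplier a step, stress-test, and bump Vcore one increment on failure. Here is a toy simulation of it in Python; `toy_test` is a made-up stand-in for a real Prime95 run, and all the voltage numbers (in millivolts) are illustrative assumptions, not recommendations for your chip.

```python
# Conceptual simulation of the multiplier/Vcore tuning loop above.
# stress_test_passes is a stand-in for an actual stress-test run.

def find_stable_settings(target_multiplier, stress_test_passes,
                         start_multiplier=12, vcore_mv=1200,
                         step_mv=25, limit_mv=1400):
    """Raise the multiplier one step at a time; on a failed test,
    raise Vcore one increment and retest at the same multiplier."""
    multiplier = start_multiplier
    while multiplier < target_multiplier:
        multiplier += 1
        while not stress_test_passes(multiplier, vcore_mv):
            vcore_mv += step_mv
            if vcore_mv > limit_mv:
                raise RuntimeError("hit the Vcore safety limit; "
                                   "settle for a lower overclock")
    return multiplier, vcore_mv

def toy_test(multiplier, vcore_mv):
    # Made-up model: each multiplier step above x18 wants 50 mV more.
    return vcore_mv >= 1200 + max(0, multiplier - 18) * 50

print(find_stable_settings(21, toy_test))  # (21, 1350)
```

The safety limit in the sketch plays the same role as the temperature ceiling in the real procedure: it's the point where you stop raising voltage and accept a lower clock instead.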
I like to test it in more practical situations as well, just to make sure: if you're a gamer, play a little Crysis; if you're a video encoder, throw a Blu-ray at Handbrake and see if it runs without error. If everything checks out, then congratulations! You have successfully overclocked your system.

There's plenty more to overclocking, but this guide should be enough to keep any beginner busy for a while. Remember to do some outside reading; the sites mentioned at the beginning of this article contain a ton of information. In particular, I had a lot of help from these specific guides, and I recommend you take a look at them as well:

3 Step Guide to Overclock Your Core i3, i5, or i7 - Overclockers.com
i7 Overclocking for Beginners - Hexus.net
Ultimate Core i7 Overclocking Guide - MaximumPC
Guide to Overclocking the Core i7 920 or 930 to 4.0GHz - Overclock.net

Some of these are processor-specific, but you should be able to find similar guides for your processor with a bit of googling. I highly recommend looking at these anyway, as well as other articles they link to, because there's some really good information in there about some of the more contested settings, as well as background on how it all works.

A few things to keep in mind: everyone's system is different. Overclocking forums can be extremely helpful for general information, but just because someone clocked their system to 4.5 GHz does not mean you can do the same, even with nearly identical gear. Sometimes these posts can point out settings we haven't covered, but in general, this method will get you where you need to go. Of course, everyone does things a little differently, and some of you have probably overclocked your fair share of systems, so feel free to share your thoughts, methods, and tips in the comments.

The History of Computers

"Who invented the computer?" is not a question with a simple answer. The real answer is that many inventors contributed to the history of computers, and that a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention. This series covers many of the major milestones in computer history (but not all of them), with a concentration on the history of personal home computers.

1936 - Konrad Zuse, Z1 Computer: the first freely programmable computer.
1942 - John Atanasoff & Clifford Berry, ABC Computer: who was first in the computing biz is not always as easy as ABC.
1944 - Howard Aiken & Grace Hopper, Harvard Mark I Computer: the Harvard Mark 1 computer.
1946 - John Presper Eckert & John W. Mauchly, ENIAC 1 Computer: 20,000 vacuum tubes later...
1948 - Frederic Williams & Tom Kilburn, Manchester Baby Computer & the Williams Tube: Baby and the Williams Tube turn on the memories.
1947/48 - John Bardeen, Walter Brattain & William Shockley, the transistor: no, a transistor is not a computer, but this invention greatly affected the history of computers.
1951 - John Presper Eckert & John W. Mauchly, UNIVAC Computer: the first commercial computer, able to pick presidential winners.
1953 - International Business Machines, IBM 701 EDPM Computer: IBM enters 'The History of Computers'.
1954 - John Backus & IBM, FORTRAN: the first successful high-level programming language.
1955 (in use 1959) - Stanford Research Institute, Bank of America, and General Electric, ERMA and MICR: the first bank-industry computer, plus MICR (magnetic ink character recognition) for reading checks.
1958 - Jack Kilby & Robert Noyce, the integrated circuit: otherwise known as 'the chip'.
1962 - Steve Russell & MIT, Spacewar: the first computer game.
1964 - Douglas Engelbart, computer mouse & windows: nicknamed the mouse because the tail came out the end.
1969 - ARPAnet: the original Internet.
1970 - Intel 1103: the world's first commercially available dynamic RAM chip.
1971 - Faggin, Hoff & Mazor, Intel 4004: the first microprocessor.
1971 - Alan Shugart & IBM, the "floppy" disk: nicknamed the "floppy" for its flexibility.
1973 - Robert Metcalfe & Xerox, Ethernet: computer networking.
1974/75 - Scelbi, Mark-8, Altair & IBM 5100: the first consumer computers.
1976/77 - Apple I and II, TRS-80 & Commodore PET: more first consumer computers.
1978 - Dan Bricklin & Bob Frankston, VisiCalc: any product that pays for itself in two weeks is a surefire winner.
1979 - Seymour Rubenstein & Rob Barnaby, WordStar: word processing software.
1981 - IBM, the IBM PC: from an "Acorn" grows a personal computer revolution.
1981 - Microsoft, MS-DOS: from "Quick and Dirty" comes the operating system of the century.
1983 - Apple, Lisa: the first home computer with a GUI (graphical user interface).
1984 - Apple, Macintosh: the more affordable home computer with a GUI.
1985 - Microsoft, Windows: Microsoft begins the friendly war with Apple.

SERIES TO BE CONTINUED

Machine of the Year: The Computer Moves In

By the millions, it is beeping its way into offices, schools and homes. By Otto Friedrich. Reported by Michael Moritz/San Francisco, J. Madeleine Nash/Chicago and Peter Stoler/New York. WILL SOMEONE PLEASE TELL ME, the bright red advertisement asks in mock irritation, WHAT A PERSONAL COMPUTER CAN DO? The ad provides not merely an answer, but 100 of them. A personal computer, it says, can send letters at the speed of light, diagnose a sick poodle, custom-tailor an insurance program in minutes, test recipes for beer. Testimonials abound. Michael Lamb of Tucson figured out how a personal computer could monitor anesthesia during surgery; the rock group Earth, Wind and Fire uses one to explode smoke bombs onstage during concerts; the Rev. Ron Jaenisch of Sunnyvale, Calif., programmed his machine so it can recite an entire wedding ceremony. In the cavernous Las Vegas Convention Center a month ago, more than 1,000 computer companies large and small were showing off their wares, their floppy discs and disc drives, joy sticks and modems, to a mob of some 50,000 buyers, middlemen and assorted technology buffs. Look! Here is Hewlett-Packard's HP9000, on which you can sketch a new airplane, say, and immediately see the results in 3-D through holograph imaging; here is how the Votan can answer and act on a telephone call in the middle of the night from a salesman on the other side of the country; here is the Olivetti M20 that entertains bystanders by drawing garishly colored pictures of Marilyn Monroe; here is a program designed by The Alien Group that enables an Atari computer to say aloud anything typed on its keyboard, in any language. It also sings, in a buzzing humanoid voice, Amazing Grace and When I'm 64 or anything else that anyone wants to teach it.
As both the Apple Computer advertisement and the Las Vegas circus indicate, the enduring American love affairs with the automobile and the television set are now being transformed into a giddy passion for the personal computer. This passion is partly fad, partly a sense of how life could be made better, partly a gigantic sales campaign. Above all, it is the end result of a technological revolution that has been in the making for four decades and is now, quite literally, hitting home. Americans are receptive to the revolution and optimistic about its impact. A new poll* for TIME by Yankelovich, Skelly and White indicates that nearly 80% of Americans expect that in the fairly near future, home computers will be as commonplace as television sets or dishwashers. Although they see dangers of unemployment and dehumanization, solid majorities feel that the computer revolution will ultimately raise production and therefore living standards (67%), and that it will improve the quality of their children's education (68%). [*The telephone survey of 1,019 registered voters was conducted on Dec. 8 and 9. The margin of sampling error is plus or minus 3%.] The sales figures are awesome and will become more so. In 1980 some two dozen firms sold 724,000 personal computers for $1.8 billion. The following year 20 more companies joined the stampede, including giant IBM, and sales doubled to 1.4 million units at just under $3 billion. When the final figures are in for 1982, according to Dataquest, a California research firm, more than 100 companies will probably have sold 2.8 million units for $4.9 billion. To be sure, the big, complex, costly "mainframe" computer has been playing an increasingly important role in practically everyone's life for the past quarter-century. It predicts the weather, processes checks, scrutinizes tax returns, guides intercontinental missiles and performs innumerable other operations for governments and corporations.
The computer has made possible the exploration of space. It has changed the way wars are fought, as the Exocet missile proved in the South Atlantic and Israel's electronically sophisticated forces did in Lebanon. Despite its size, however, the mainframe does its work all but invisibly, behind the closed doors of a special, climate-controlled room. Now, thanks to the transistor and the silicon chip, the computer has been reduced so dramatically in both bulk and price that it is accessible to millions. In 1982 a cascade of computers beeped and blipped their way into the American office, the American school, the American home. The "information revolution" that futurists have long predicted has arrived, bringing with it the promise of dramatic changes in the way people live and work, perhaps even in the way they think. America will never be the same. In a larger perspective, the entire world will never be the same. The industrialized nations of the West are already scrambling to computerize (1982 sales: 435,000 in Japan, 392,000 in Western Europe). The effect of the machines on the Third World is more uncertain. Some experts argue that computers will, if anything, widen the gap between haves and have-nots. But the prophets of high technology believe the computer is so cheap and so powerful that it could enable underdeveloped nations to bypass the whole industrial revolution. While robot factories could fill the need for manufactured goods, the microprocessor would create myriad new industries, and an international computer network could bring important agricultural and medical information to even the most remote villages. "What networks of railroads, highways and canals were in another age, networks of telecommunications, information and computerization...are today," says Austrian Chancellor Bruno Kreisky.
Says French Editor Jean-Jacques Servan-Schreiber, who believes that the computer's teaching capability can conquer the Third World's illiteracy and even its tradition of high birth rates: "It is the source of new life that has been delivered to us." The year 1982 was filled with notable events around the globe. It was a year in which death finally pried loose Leonid Brezhnev's frozen grip on the Soviet Union, and Yuri Andropov, the cold-eyed ex-chief of the KGB, took command. It was a year in which Israel's truculent Prime Minister Menachem Begin completely redrew the power map of the Middle East by invading neighboring Lebanon and smashing the Palestinian guerrilla forces there. The military campaign was a success, but all the world looked with dismay at the thunder of Israeli bombs on Beirut's civilians and at the massacres in the Palestinian refugee camps. It was a year in which Argentina tested the decline of European power by seizing the Falkland Islands, only to see Britain, led by doughty Margaret Thatcher, meet the test by taking them back again. Nor did all of the year's major news derive from wars or the threat of international violence. Even as Ronald Reagan cheered the sharpest decline in the U.S. inflation rate in ten years, 1982 brought the worst unemployment since the Great Depression (12 million jobless) as well as budget deficits that may reach an unprecedented $180 billion in fiscal 1982. High unemployment plagued Western Europe as well, and the multibillion-dollar debts of more than two dozen nations gave international financiers a severe fright. It was also a year in which the first artificial heart began pumping life inside a dying man's chest, a year in which millions cheered the birth of cherubic Prince William Arthur Philip Louis of Britain, and millions more rooted for a wrinkled, turtle-like figure struggling to find its way home to outer space.
There are some occasions, though, when the most significant force in a year's news is not a single individual but a process, and a widespread recognition by a whole society that this process is changing the course of all other processes. That is why, after weighing the ebb and flow of events around the world, TIME has decided that 1982 is the year of the computer. It would have been possible to single out as Man of the Year one of the engineers or entrepreneurs who masterminded this technological revolution, but no one person has clearly dominated those turbulent events. More important, such a selection would obscure the main point. TIME's Man of the Year for 1982, the greatest influence for good or evil, is not a man at all. It is a machine: the computer. It is easy enough to look at the world around us and conclude that the computer has not changed things all that drastically. But one can conclude from similar observations that the earth is flat, and that the sun circles it every 24 hours. Although everything seems much the same from one day to the next, changes under the surface of life's routines are actually occurring at almost unimaginable speed. Just 100 years ago, parts of New York City were lighted for the first time by a strange new force called electricity; just 100 years ago, the German Engineer Gottlieb Daimler began building a gasoline-fueled internal combustion engine (three more years passed before he fitted it to a bicycle). So it is with the computer. The first fully electronic digital computer built in the U.S. dates back only to the end of World War II. Created at the University of Pennsylvania, ENIAC weighed 30 tons and contained 18,000 vacuum tubes, which failed at an average of one every seven minutes. The arrival of the transistor and miniaturized circuit in the 1950s made it possible to reduce a room-size computer to a silicon chip the size of a pea. And prices kept dropping. 
In contrast to the $487,000 paid for ENIAC, a top IBM personal computer today costs about $4,000, and some discounters offer a basic Timex-Sinclair 1000 for $77.95. One computer expert illustrates the trend by estimating that if the automobile business had developed like the computer business, a Rolls-Royce would now cost $2.75 and run 3 million miles on a gallon of gas. Looking ahead, the computer industry sees pure gold. There are 83 million U.S. homes with TV sets, 54 million white-collar workers, 26 million professionals, 4 million small businesses. Computer salesmen are hungrily eyeing every one of them. Estimates for the number of personal computers in use by the end of the century run as high as 80 million. Then there are all the auxiliary industries: desks to hold computers, luggage to carry them, cleansers to polish them. "The surface is barely scratched," says Ulric Weil, an analyst for Morgan Stanley. Beyond the computer hardware lies the virtually limitless market for software, all those prerecorded programs that tell the willing but mindless computer what to do. These discs and cassettes range from John Wiley & Sons' investment analysis program for $59.95 (some run as high as $5,000) to Control Data's PLATO programs that teach Spanish or physics ($45 for the first lesson, $35 for succeeding ones) to a profusion of space wars, treasure hunts and other electronic games. This most visible aspect of the computer revolution, the video game, is its least significant. But even if the buzz and clang of the arcades is largely a teen-age fad, doomed to go the way of Rubik's Cube and the Hula Hoop, it is nonetheless a remarkable phenomenon. About 20 corporations are selling some 250 different game cassettes for roughly $2 billion this year. According to some estimates, more than half of all the personal computers bought for home use are devoted mainly to games. 
Computer enthusiasts argue that these games have educational value, by teaching logic, or vocabulary, or something. Some are even used for medical therapy. Probably the most important effect of these games, however, is that they have brought a form of the computer into millions of homes and convinced millions of people that it is both pleasant and easy to operate, what computer buffs call "user friendly." Games, says Philip D. Estridge, head of IBM's personal computer operations, "aid in the discovery process." Apart from games, the two things that the computer does best have wide implications but are quite basic. One is simply computation, manipulating thousands of numbers per second. The other is the ability to store, sort through and rapidly retrieve immense amounts of information. More than half of all employed Americans now earn their living not by producing things but as "knowledge workers," exchanging various kinds of information, and the personal computer stands ready to change how all of them do their jobs. Frank Herringer, a group vice president of Transamerica Corp., installed an Apple in his suburban home in Lafayette, Calif., and spent a weekend analyzing various proposals for Transamerica's $300 million takeover of the New York insurance brokerage firm of Fred S. James Co. Inc. "It allowed me to get a good feel for the critical numbers," says Herringer. "I could work through alternative options, and there were no leaks." Terry Howard, 44, used to have a long commute to his job at a San Francisco stock brokerage, where all his work involved computer data and telephoning. With a personal computer, he set up his own firm at home in San Rafael. Instead of rising at 6 a.m. to drive to the city, he runs five miles before settling down to work. Says he: "It didn't make sense to spend two hours of every day burning up gas, when my customers on the telephone don't care whether I'm sitting at home or in a high rise in San Francisco." 
John Watkins, safety director at Harriet & Henderson Yarns, in Henderson, N.C., is one of 20 key employees whom the company helped to buy home computers and paid to get trained this year. Watkins is trying to design a program that will record and analyze all mill accidents: who was injured, how, when, why. Says he: "I keep track of all the cases that are referred to a doctor, but for every doctor case, there are 25 times as many first-aid cases that should be recorded." Meantime, he has designed a math program for his son Brent and is shopping for a word-processing program to help his wife Mary Edith write her master's thesis in psychology. Says he: "I don't know what it can't do. It's like asking yourself, `What's the most exciting thing you've ever done?' Well, I don't know because I haven't done it yet." Aaron Brown, a former defensive end for the Kansas City Chiefs and now an office-furniture salesman in Minneapolis, was converted to the computer by his son Sean, 15, who was converted at a summer course in computer math. "I thought of computers very much as toys," says Brown, "but Sean started telling me. `You could use a computer in your work.' I said, `Yeah, yeah, yeah.'" Three years ago, the family took a vote on whether to go to California for a vacation or to buy an Apple. The Apple won, 3 to 1, and to prove its value, Sean wrote his father a program that computes gross profits and commissions on any sale. Brown started with "simple things," like filing the names and telephone numbers of potential customers. "Say I was going to a particular area of the city," Brown says. "I would ask the computer to pull up the accounts in a certain zip-code area, or if I wanted all the customers who were interested in whole office systems, I could pull that up too." The payoff: since he started using the computer, he has doubled his annual sales to more than $1 million. 
Brown has spent about $1,500 on software, all bound in vinyl notebooks along a wall of his home in Golden Valley, Minn., but Sean still does a lot of programming on his own. He likes to demonstrate one that he designed to teach French. "Vive la France!" it says, and then starts beeping the first notes of La Marseillaise. His mother Reatha uses the computer to help her manage a gourmet cookware store, and even his sister Terri, who originally cast the family's lone vote against the computer, uses it to store her high school class notes. Says Brown: "It's become kind of like the bathroom. If someone is using it, you wait your turn." Reatha Brown has been lobbying for a new carpet, but she is becoming resigned to the prospect that the family will acquire a new hard-disc drive instead. "The video-cassette recorder," she sighs, pointing across the room, "that was my other carpet." Replies her husband, setting forth an argument that is likely to be replayed in millions of households in the years just ahead: "We make money with the computer, but all we can do with a new carpet is walk on it. Somebody once said there were five reasons to spend money: on necessities, on investments, on self-improvement, on memories and to impress your friends. The carpet falls in that last category, but the computer falls in all five." By itself, the personal computer is a machine with formidable capabilities for tabulating, modeling or recording. Those capabilities can be multiplied almost indefinitely by plugging it into a network of other computers. This is generally done by attaching a desk-top model to a telephone line (two-way cables and earth satellites are coming increasingly into use). One can then dial an electronic data base, which not only provides all manner of information but also collects and transmits messages: electronic mail. The 1,450 data bases that now exist in the U.S. 
range from general information services like the Source, a Reader's Digest subsidiary in McLean, Va., which can provide stock prices, airline schedules or movie reviews, to more specialized services like the American Medical Association's AMA/NET, to real esoterica like the Hughes Rotary Rig Report. Fees vary from $300 an hour to less than $10. Just as the term personal computer can apply to both a home machine and an office machine (and indeed blurs the distinction between the two places) many of the first enthusiastic users of these devices have been people who do much of their work at home: doctors, lawyers, small businessmen, writers, engineers. Such people also have special needs for the networks of specialized data. Orthopedic Surgeon Jon Love, of Madisonville, Ky., connects the Apple in his home to both the AMA/NET, which offers, among other things, information on 1,500 different drugs, and Medline, a compendium of all medical articles published in the U.S. "One day I accessed the computer three times in twelve minutes," he says. "I needed information on arthritis and cancer in the leg. It saved me an hour and a half of reading time. I want it to pay me back every time I sit down at it." Charles Manly III practices law in Grinnell, Iowa (pop. 8,700), a town without a law library, so he pays $425 a month to connect his CPT word processor to Westlaw, a legal data base in St. Paul. Just now he needs precedents in an auto insurance case. He dials the Westlaw telephone number, identifies himself by code, then types: "Courts (Iowa) underinsurance." The computer promptly tells him there is only one such Iowa case, and it is 14 years old. Manly asks for a check on other Midwestern states, and it gives him a long list of precedents in Michigan and Minnesota. "I'm not a chiphead," he says, "but if you don't keep up with the new developments, even in a rural general practice, you're not going to have the competitive edge." 
The personal computer and its networks are even changing that oldest of all home businesses, the family farm. Though only about 3% of commercial farmers and ranchers now have computers, that number is expected to rise to nearly 20% within the next five years. One who has grasped the true faith is Bob Johnson, who helps run his family's 2,800-acre pig farm near De Kalb, Ill. Outside, the winter's first snowflakes have dusted the low-slung roofs of the six red-and-white barns and the brown fields specked with corn stubble. Inside the two-room office building, Johnson slips a disc into his computer and types "D" (for dial) and a telephone number. He is immediately connected to the Illinois farm bureau's newly computerized AgriVisor service. It not only gives him weather conditions to the west and the latest hog prices on the Chicago commodities exchange, but also offers advice. Should farmers continue to postpone the sale of their newly harvested corn? "Remember," the computer counsels, "that holding on for a dime or a nickel may not be worth the long-term wait." Johnson started out playing computer games on an Apple II, but then "those got shoved in the file cabinet." He began computerizing all his farm records, which was not easy. "We could keep track of the hogs we sold in dollars, but we couldn't keep track of them by pounds and numbers at the same time." He started shopping around and finally acquired a $12,000 combination at a shop in Lafayette, Ind.: a microcomputer from California Computer Systems, a video screen from Ampex, a Diablo word printer and an array of agricultural programs. Johnson's computer now knows the yields on 35 test plots of corn, the breeding records of his 300 sows, how much feed his hogs have eaten (2,787,260 lbs.) and at what cost ($166,047.73). "This way, you can charge your hogs the cost of the feed when you sell them and figure out if you're making any money," says Johnson. "We never had this kind of information before. 
It would have taken too long to calculate. But we knew we needed it." Just as the computer is changing the way work is done in home offices, so it is revolutionizing the office. Routine tasks like managing payrolls and checking inventories have long since been turned over to computers, but now the typewriter is giving way to the word processor, and every office thus becomes part of a network. This change has barely begun: about 10% of the typewriters in the 500 largest industrial corporations have so far been replaced. But the economic imperatives are inescapable. All told, office professionals could save about 15% of their time if they used the technology now available, says a study by Booz, Allen & Hamilton, and that technology is constantly improving. In one survey of corporations, 55% said they were planning to acquire the latest equipment. This technology involves not just word processors but computerized electronic message systems that could eventually make paper obsolete, and wall-size, two-way TV teleconference screens that will obviate traveling to meetings. The standard home computer is sold only to somebody who wants one, but the same machine can seem menacing when it appears in an office. Secretaries are often suspicious of new equipment, particularly if it appears to threaten their jobs, and so are executives. Some senior officials resist using a keyboard on the ground that such work is demeaning. Two executives in a large firm reportedly refuse to read any computer print-out until their secretaries have retyped it into the form of a standard memo. "The biggest problem in introducing computers into an office is management itself," says Ted Stout of National Systems Inc., an office design firm in Atlanta. "They don't understand it, and they are scared to death of it." But there is an opposite fear that drives anxious executives toward the machines: the worry that younger and more sophisticated rivals will push ahead of them. 
"All you have to do," says Alexander Horniman, an industrial psychologist at the University of Virginia's Darden School of Business, "is walk down the hall and see people using the computer and imagine they have access to all sorts of information you don't." Argues Harold Todd, executive vice president at First Atlanta Bank: "Managers who do not have the ability to use a terminal within three to five years may become organizationally dysfunctional." That is to say, useless. If more and more offices do most of their work on computers, and if a personal computer can be put in a living room, why should anyone have to go to work in an office at all? The question can bring a stab of hope to anybody who spends hours every day on the San Diego Freeway or the Long Island Rail Road. Nor is "telecommuting" as unrealistic as it sounds. Futurist Jack Nilles of the University of Southern California has estimated that many home computer would soon pay for itself from savings in commuting expenses and in city office rentals. Is the great megalopolis, the marketplace of information, about to be doomed by the new technology? Another futurist, Alvin Toffler, suggests at least a trend in that direction. In his 1980 book, The Third Wave, he portrays a 21st century world in which the computer revolution has canceled out many of the fundamental changes wrought by the Industrial Revolution: the centralization and standardization of work in the factory, the office, the assembly line. These changes may seem eternal, but they are less than two centuries old. Instead, Toffler imagines a revived version of pre-industrial life in what he has named "the electronic cottage," a utopian abode where all members of the family work, learn and enjoy their leisure around the electronic hearth, the computer. Says Vice President Louis H. Mertes of the Continental Illinois Bank and Trust Co. 
of Chicago, who is such a computer enthusiast that he allows no paper to be seen in his office (though he does admit to keeping a few files in the drawer of an end table): "We're talking when--not if--the electronic cottage will emerge." Continental Illinois has experimented with such electronic cottages by providing half a dozen workers with word processors so they could stay at home. Control Data tried a similar experiment and ran into a problem: some of its 50 "alternate site workers" felt isolated, deprived of their social life around the water cooler. The company decided to ask them to the office for lunch and meetings every week. "People are like ants, they're communal creatures," says Dean Scheff, chairman and founder of CPT Corp., a word-processing firm near Minneapolis. "They need to interact to get the creative juices flowing. Very few of us are hermits." TIME's Yankelovich poll underlines the point. Some 73% of the respondents believed that the computer revolution would enable more people to work at home. But only 31% said they would prefer to do so themselves. Most work no longer involves a hayfield, a coal mine or a sweatshop, but a field for social intercourse. Psychologist Abraham Maslow defined work as a hierarchy of functions: it first provides food and shelter, the basics, but then it offers security, friendship, "belongingness." This is not just a matter of trading gossip in the corridors; work itself, particularly in the information industries, requires the stimulation of personal contact in the exchange of ideas: sometimes organized conferences, sometimes simply what is called "the schmooze factor." Says Sociologist Robert Schrank: "The workplace performs the function of community." But is this a basic psychological reality or simply another rut dug by the Industrial Revolution? Put another way, why do so many people make friends at the office rather than among their neighbors? 
Prophets of the electronic cottage predict that it will once again enable people to find community where they once did: in their communities. Continental Illinois Bank, for one, has opened a suburban "satellite work station" that gets employees out of the house but not all the way downtown. Ford, Atlantic Richfield and Merrill Lynch have found that teleconferencing can reach far more people for far less money than traditional sales conferences. Whatever the obstacles, telecommuting seems particularly rich with promise for millions of women who feel tied to the home because of young children. Sarah Sue Hardinger has a son, 3, and a daughter three months old; the computer in her cream-colored stucco house in South Minneapolis is surrounded by children's books, laundry, a jar of Dippity Do. An experienced programmer at Control Data before she decided to have children, she now settles in at the computer right after breakfast, sometimes holding the baby in a sling. She starts by reading her computer mail, then sets to work converting a PLATO grammar program to a disc that will be compatible with Texas Instruments machines. "Mid-morning I have to start paying attention to the three-year-old, because he gets antsy," says Hardinger. "Then at 11:30 comes Sesame Street and Mr. Rogers, so that's when I usually get a whole lot done." When her husband, a building contractor, comes home and takes over the children, she returns to the computer. "I use part of my house time for work, part of my work time for the house," she says. "The baby has demand feeding, I have demand working." To the nation's 10 million physically handicapped, telecommuting encourages new hopes of earning a livelihood. A Chicago-area organization called Lift has taught computer programming to 50 people with such devastating afflictions as polio, cerebral palsy and spinal damage. 
Lift President Charles Schmidt cites a 46-year-old man paralyzed by polio: "He never held a job in his life until he entered our program three years ago, and now he's a programmer for Walgreens." Just as the vast powers of the personal computer can be vastly multiplied by plugging it into an information network, they can be extended in all directions by attaching the mechanical brain to sensors, mechanical arms and other robotic devices. Robots are already at work in a large variety of dull, dirty or dangerous jobs: painting automobiles on assembly lines and transporting containers of plutonium without being harmed by radiation. Because a computerized robot is so easy to reprogram, some experts foresee drastic changes in the way manufacturing work is done: toward customization, away from assembly-line standards. When the citizen of tomorrow wants a new suit, one futurist scenario suggests, his personal computer will take his measurements and pass them on to a robot that will cut his choice of cloth with a laser beam and provide him with a perfectly tailored garment. In the home too, computer enthusiasts delight in imagining machines performing the domestic chores. A little of that fantasy is already reality. New York City Real Estate Executive David Rose, for example, uses his Apple in business deals, to catalogue his 4,000 books and to write fund-raising letters to his Yale classmates. But he also uses it to wake him in the morning with soft music, turn on the TV, adjust the lights and make the coffee. In medicine, the computer, which started by keeping records and sending bills, now suggests diagnoses. CADUCEUS knows some 4,000 symptoms of more than 500 diseases; MYCIN specializes in infectious diseases; PUFF measures lung functions. All can be plugged into a master network called SUMEX-AIM, with headquarters at Stanford in the West and Rutgers in the East. 
This may all sound like another step toward the disappearance of the friendly neighborhood G.P., but it is hardly possible that a family doctor would recognize 4,000 different symptoms; CADUCEUS is more likely to see patterns in what patients report and can then suggest a diagnosis. The process may sound dehumanized, but in one hospital where the computer specializes in peptic ulcers, a survey of patients showed that they found the machine "more friendly, polite, relaxing and comprehensible" than the average physician. The microcomputer is achieving dramatic effects on the ailing human body. These devices control the pacemakers implanted in victims of heart disease; they pump carefully measured quantities of insulin into the bodies of diabetics; they test blood samples for hundreds of different allergies; they translate sounds into vibrations that the deaf can "hear"; they stimulate deadened muscles with electric impulses that may eventually enable the paralyzed to walk. In all the technologists' images of the future, however, there are elements of exaggeration and wishful thinking. Though the speed of change is extraordinary, so is the vastness of the landscape to be changed. New technologies have generally taken at least 20 years to establish themselves, which implies that a computer salesman's dream of a micro on every desk will not be fulfilled in the very near future. If ever. Certainly the personal computer is not without its flaws. As most new buyers soon learn, it is not that easy for a novice to use, particularly when the manuals contain instructions like this specimen from Apple: "This character prevents script from terminating the currently forming output line when it encounters the script command in the input stream." Another problem is that most personal computers end up costing considerably more than the ads imply. 
The $100 model does not really do very much, and the $1,000 version usually requires additional payments for the disc drive or the printer or the modem. Since there is very little standardization of parts among the dozens of new competitors, a buyer who has not done considerable homework is apt to find that the parts he needs do not fit the machine he bought. Software can be a major difficulty. The first computer buyers tended to be people who enjoyed playing with their machines and designing their own programs. But the more widely the computer spreads, the more it will have to be used by people who know no more about its inner workings than they do about the insides of their TV sets--and do not want to. They will depend entirely on the commercial programmers. Good programs are expensive both to make and to buy. Control Data has invested $900 million in its PLATO educational series and has not yet turned a profit, though its hopes run into the billions. A number of firms have marketed plenty of shoddy programs, but they are not cheap either. "Software is the new bandwagon, but only 20% of it is any good," says Diana Hestwood, a Minneapolis-based educational consultant. She inserts a math program and deliberately makes ten mistakes. The machine gives its illiterate verdict: "You taken ten guesses." Says Atari's chief scientist, Alan Kay: "Software is getting to be embarrassing." Many of the programs now being touted are hardly worth the cost, or hardly worth doing at all. Why should a computer be needed to balance a checkbook or to turn off the living-room lights? Or to recommend a dinner menu, particularly when it can consider (as did a $34 item called the Pizza Program) ice cream as an appetizer? Indeed, there are many people who may quite reasonably decide that they can get along very nicely without a computer. Even the most impressive information networks may provide the customer with nothing but a large telephone bill. 
"You cannot rely on being able to find what you want," says Atari's Kay. It's really more useful to go to a library." It is becoming increasingly evident that a fool assigned to work with a computer can conceal his own foolishness in the guise of high-tech authority. Lives there a single citizen who has not been commanded by a misguided computer to pay an income tax installment or department store bill that he has already paid? What is true for fools is no less true for criminals, who are now able to commit electronic larceny from the comfort of their living room. The probable champion is Stanley Mark Rifkin, a computer analyst in Los Angeles, who tricked the machines at the Security Pacific National Bank into giving him $10 million. While free on bail for that in 1979 (he was eventually sentenced to eight years), he was arrested for trying to steal $50 million from Union Bank (the charges were eventually dropped). According to Donn Parker, a specialist in computer abuse at SRI International (formerly the Stanford Research Institute), "Nobody seems to know exactly what computer crime is, how much of it there is, and whether it is increasing or decreasing. We do know that computers are changing the nature of business crime significantly." Even if all the technical and intellectual problems can be solved, there are major social problems inherent in the computer revolution. The most obvious is unemployment, since the basic purpose of commercial computerization is to get more work done by fewer people. One British study predicts that "automation-induced unemployment" in Western Europe could reach 16% in the next decade, but most analyses are more optimistic. The general rule seems to be that new technology eventually creates as many jobs as it destroys, and often more. "People who put in computers usually increase their staffs as well," says CPT's Scheff. "Of course," he adds, "one industry may kill another industry. That's tough on some people." 
Theoretically, all unemployed workers can be retrained, but retraining programs are not high on the nation's agenda. Many new jobs, moreover, will require an aptitude in using computers, and the retraining needed to use them will have to be repeated as the technology keeps improving. Says a chilling report by the Congressional Office of Technology Assessment: "Lifelong retraining is expected to become the norm for many people." There is already considerable evidence that the school children now being educated in the use of computers are generally the children of the white middle class. Young blacks, whose unemployment rate stands today at 50%, will find another barrier in front of them. Such social problems are not the fault of the computer, of course, but a consequence of the way the American society might use the computer. "Even in the days of the big mainframe computers, they were a machine for the few," says Katherine Davis Fishman, author of The Computer Establishment. "It was a tool to help the rich get richer. It still is to a large extent. One of the great values of the personal computer is that smaller concerns, smaller organizations can now have some of the advantages of the bigger organizations." How society uses its computers depends greatly on what kind of computers are made and sold, and that depends, in turn, on an industry in a state of chaotic growth. Even the name of the product is a matter of debate: "microcomputer" sounds too technical, but "home computer" does not fit an office machine. "Desktop" sounds awkward, and "personal computer" is at best a compromise. Innovators are pushing off in different directions. Hewlett-Packard is experimenting with machines that respond to vocal commands; Osborne is leading a rush toward portable computers, ideally no larger than a book. And for every innovator, there are at least five imitators selling copies. There is much talk of a coming shakeout, and California Consultant David E. 
Gold predicts that perhaps no more than a dozen vendors will survive the next five years. At the moment, Dataquest estimates that Texas Instruments leads the low-price parade with a 35% share of the market in computers selling for less than $1,000. Next come Timex (26%), Commodore (15%) and Atari (13%). In the race among machines priced between $1,000 and $5,000, Apple still commands 26%, followed by IBM (17%) and Tandy/Radio Shack (10%). But IBM, which has dominated the mainframe computer market for decades, is coming on very strong. Apple, fighting back, will unveil its new Lisa model in January, putting great emphasis on user friendliness. The user will be able to carry out many functions simply by pointing to a picture of what he wants done rather than typing instructions. IBM is also reported to be planning to introduce new machines in 1983, as are Osborne and others. Just across the horizon, as usual, lurk the Japanese. During the 1970s, U.S. computer manufacturers complacently felt that they were somehow immune from the Japanese combination of engineering and salesmanship that kept gnawing at U.S. auto, steel and appliance industries. One reason was that the Japanese were developing their large domestic market. When they belatedly entered the U.S. battlefield, they concentrated not on selling whole systems but on particular sectors--with dramatic results. In low-speed printers using what is known as the dot-matrix method, the Japanese had only a 6% share of the market in 1980; in 1982, they provided half the 500,000 such printers sold in the U.S. Says Computerland President Ed Faber: "About 75% of the dot-matrix printers we sell are Japanese, and almost all the monitors. There is no better quality electronics than what we see coming from Japan." Whatever its variations, there is an inevitability about the computerization of America. Commercial efficiency requires it, Big Government requires it, modern life requires it, and so it is coming to pass. 
But the essential element in this sense of inevitability is the way in which the young take to computers: not as just another obligation imposed by adult society but as a game, a pleasure, a tool, a system that fits naturally into their lives. Unlike anyone over 40, these children have grown up with TV screens; the computer is a screen that responds to them, hooked to a machine that can be programmed to respond the way they want it to. That is power. There are now more than 100,000 computers in U.S. schools, compared with 52,000 only 18 months ago. This is roughly one for every 400 pupils. The richer and more progressive states do better. Minnesota leads with one computer for every 50 children and a locally produced collection of 700 software programs. To spread this development more evenly and open new doors for business, Apple has offered to donate one computer to every public school in the U.S.--a total of 80,000 computers worth $200 million retail--if Washington will authorize a 25% tax write-off (as is done for donations of scientific equipment to colleges). Congress has so far failed to approve the idea, but California has agreed to a similar proposal. Many Americans concerned about the erosion of the schools put faith in the computer as a possible savior of their children's education, at school and at home. The Yankelovich poll showed that 57% thought personal computers would enable children to read and to do arithmetic better. Claims William Ridley, Control Data's vice president for education strategy: "If you want to improve youngsters one grade level in reading, our PLATO program with teacher supervision can do it up to four times faster and for 40% less expense than teachers alone." No less important than this kind of drill, which some critics compare with the old-fashioned flash cards, is the use of computers to teach children about computers. 
They like to learn programming, and they are good at it, often better than their teachers, even in the early grades. They treat it as play, a secret skill, unknown among many of their parents. They delight in cracking corporate security and filching financial secrets, inventing new games and playing them on military networks, inserting obscene jokes into other people's programs. In soberer versions, that sort of skill will become a necessity in thousands of jobs opening up in the future. Beginning in 1986, Carnegie-Mellon University expects to require all of its students to have their own personal computers. "People are willing to spend a large amount of money to educate their children," says Author Fishman. "So they're all buying computers for Johnny to get a head start (though I have not heard anyone say, 'I am buying a computer for Susie')." This transformation of the young raises a fundamental and sometimes menacing question: Will the computer change the very nature of human thought? And if so, for better or worse? There has been much time wasted on the debate over whether computers can be made to think, as HAL seemed to be doing in 2001, when it murdered the astronauts who might challenge its command of the spaceflight. That answer is simple: computers do not think, but they do simulate many of the processes of the human brain: remembering, comparing, analyzing. And as people rely on the computer to do things that they used to do inside their heads, what happens to their heads? Will the computer's ability to do routine work mean that human thinking will shift to a higher level? Will IQs rise? Will there be more intellectuals? The computer may make a lot of learning as unnecessary as memorizing the multiplication tables. But if a dictionary stored in the computer's memory can easily correct any spelling mistakes, what is the point of learning to spell? 
And if the mind is freed from intellectual routine, will it race off in pursuit of important ideas or lazily spend its time on more video games? Too little is known about how the mind works, and less about how the computer might change that process. The neurological research of Mark Rosenzweig and his colleagues at Berkeley indicates that animals trained to learn and assimilate information develop heavier cerebral cortices, more glial cells and bigger nerve cells. But does the computer really stimulate the brain's activity or, by doing so much of its work, permit it to go slack? Some educators do believe they see the outlines of change. Seymour Papert, professor of mathematics and education at M.I.T. and author of Mindstorms: Children, Computers and Powerful Ideas, invented the computer language named Logo, with which children as young as six can program computers to design mathematical figures. Before they can do that, however, they must learn how to analyze a problem logically, step by step. "Getting a computer to do something," says Papert, "requires the underlying process to be described, on some level, with enough precision to be carried out by the machine." Charles P. Lecht, president of the New York consulting firm Lecht Scientific, argues that "what the lever was to the body, the computer system is to the mind." Says he: "Computers help teach kids to think. Beyond that, they motivate people to think. There is a great difference between intelligence and manipulative capacity. Computers help us to realize that difference." The argument that computers train minds to be logical makes some experts want to reach for the computer key that says ERASE. "The last thing you want to do is think more logically," says Atari's Kay. "The great thing about computers is that they have no gravity systems. The logical system is one that you make up. Computers are a wonderful way of being bizarre." 
Sherry Turkle, a sociologist now finishing a book titled The Intimate Machine: Social and Cultural Studies of Computers and People, sees the prospect of change in terms of perceptions and feelings. Says she: "Children define what's special about people by contrasting them with their nearest neighbors, which have always been the animals. People are special because they know how to think. Now children who work with computers see the computer as their nearest neighbor, so they see that people are special because they feel. This may become much more central to the way people think about themselves. We may be moving toward a re-evaluation of what makes us human." For all such prophecies, M.I.T. Computer Professor Joseph Weizenbaum has answers ranging from disapproval to scorn. He has insisted that "giving children computers to play with...cannot touch...any real problem," and he has described the new computer generation as "bright young men of disheveled appearance [playing out] megalomaniacal fantasies of omnipotence." Weizenbaum's basic objection to the computer enthusiasts is that they have no sense of limits. Says he: "The assertion that all human knowledge is encodable in streams of zeros and ones--philosophically, that's very hard to swallow. In effect, the whole world is made to seem computable. This generates a kind of tunnel vision, where the only problems that seem legitimate are problems that can be put on a computer. There is a whole world of real problems, of human problems, which is essentially ignored." So the revolution has begun, and as usually happens with revolutions, nobody can agree on where it is going or how it will end. Nils Nilsson, director of the Artificial Intelligence Center at SRI International, believes the personal computer, like television, can "greatly increase the forces of both good and evil." 
Marvin Minsky, another of M.I.T.'s computer experts, believes the key significance of the personal computer is not the establishment of an intellectual ruling class, as some fear, but rather a kind of democratization of the new technology. Says he: "The desktop revolution has brought the tools that only professionals have had into the hands of the public. God knows what will happen now." Perhaps the revolution will fulfill itself only when people no longer see anything unusual in the brave New World, when they see their computer not as a fearsome challenger to their intelligence but as a useful linkup of some everyday gadgets: the calculator, the TV and the typewriter. Or as Osborne's Adam Osborne puts it: "The future lies in designing and selling computers that people don't realize are computers at all."
[ Read More ]

Guide to identifying your Processor Technology

This guide is intended to help you identify the major processor technologies and their corresponding Processor Value Units (PVUs) per processor core. It helps clarify inconsistencies across the major server brands in how a "processor" or "N-way" is defined. Review the definitions and processor-identification steps below, and use the links to the right to view a list of processor technologies by server brand. Definitions: IBM™ Software Group defines a processor as a core, and IBM licenses Processor Value Units per processor core; hardware vendors differ when defining the terms "processor" and "N-way".
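To make the per-core licensing model concrete, here is a quick arithmetic sketch in Python. The two-socket, quad-core server and the 70-PVU-per-core rating are hypothetical values chosen for illustration, not figures from this guide; actual PVU ratings depend on the processor technology.

```python
# Hypothetical server: 2 sockets with 4 cores each.
sockets = 2
cores_per_socket = 4

# Assumed PVU rating for this processor technology (varies by chip).
pvu_per_core = 70

# IBM licenses per core, not per socket or per vendor-defined "processor".
total_cores = sockets * cores_per_socket
total_pvus = total_cores * pvu_per_core

print(total_cores, total_pvus)  # prints: 8 560
```

The point of the example is the unit of measure: a vendor may call the machine above a "2-processor" or "2-way" server, but for licensing purposes it is eight cores.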
[ Read More ]

10 Programming Languages You Should Learn Right Now

Among thousands, 10 programming languages stand out for their job marketability and wide use. If you're looking to boost your career or learn something new, start here. Knowing a handful of programming languages is seen by many as a harbor in a job-market storm, solid skills that will be marketable as long as the languages are. Yet there is beauty in numbers. While there may be developers who have had riches heaped on them by knowing the right programming language at the right time in the right place, most longtime coders will tell you that periodically learning a new language is an essential part of being a good and successful Web developer. "One of my mentors once told me that a programming language is just a programming language. It doesn't matter if you're a good programmer, it's the syntax that matters," Tim Huckaby, CEO of San Diego-based software engineering company Interknowlogy.com, told eWEEK. However, Huckaby said that while his company is "swimming" in work, he's having a nearly impossible time finding recruits, even at the entry level, who know specific programming languages. "We're hiring like crazy, but we're not having an easy time. We're just looking for attitude and aptitude, kids right out of school that know .Net, or even Java, because with that we can train them on .Net," said Huckaby. "Don't get fixated on one or two languages. When I started in 1969, FORTRAN, COBOL and S/360 Assembler were the big tickets. Today, Java, C and Visual Basic are. In 10 years' time, some new set of languages will be the in thing. …At last count, I knew/have learned over 24 different languages in over 30 years," Wayne Duquaine, director of Software Development at Grandview Systems, of Sebastopol, Calif., told eWEEK. By picking the brains of Web developers and IT recruiters, eWEEK selected 10 programming languages that are a bonus for developers to add to their resumes. Even better, they're great jumping-off points, with loads of job opportunities for younger recruits. 
1. PHP What it is: An open-source, interpretive, server-side, cross-platform HTML scripting language, especially well suited for Web development, as it can be embedded into HTML pages. Why you should learn it: It's particularly widely used. "High-speed scripting with caching, augmented with compiled code plug-ins (such as can be done with Perl and PHP), is where the future is. Building Web apps from scratch using C or COBOL is going the way of the dinosaur," said Duquaine. Job availabilities: 1,152* 2. C# What it is: A general-purpose, compiled, object-oriented programming language developed by Microsoft as part of its .NET initiative; it evolved from C and C++. Why you should learn it: It's an essential part of the .Net framework. "Learning C#, which is just Java with a different name plate, is critical if you heavily use Microsoft," said Duquaine. Job availabilities: 5,111 3. AJAX (Asynchronous JavaScript and XML) What it is: Though technically not a programming language, AJAX uses XHTML or HTML, JavaScript and XML to create interactive Web applications. Why you should learn it: Ever since Google Maps put AJAX, well, on the map, requests for AJAX-knowledgeable pros have gone through the roof. "The demand for AJAX knowledge is huge because it's so damned hard to learn," said Huckaby. Of note, Microsoft recently announced plans to release a tool named Atlas that will make AJAX easier to implement. "If Microsoft's Atlas tool is successful, it would bring the extreme complexity and annoyance of AJAX to the average worker," said Huckaby. Job availabilities: 1,106 4. JavaScript What it is: Not to be confused with Java, JavaScript is an object-oriented scripting language that runs in the Web browser on the client side. It's smaller than Java, with a simplified set of commands, is easier to code and doesn't have to be compiled. 
Why you should learn it: Embedded into HTML, it's used in millions of Web pages to validate forms, create cookies, detect browsers and improve the design. With its simplicity to learn as well as wide use, it's considered a great bang for your educational buck. Job availabilities: 4,406 5. Perl What it is: Perl is an open-source, cross-platform, server-side interpretive programming language used extensively to process text through CGI programs. Why you should learn it: Perl's power in processing piles of text has made it very popular and widely used to write Web server programs for a range of tasks. "Learning some form of scripting language, such as Perl or PHP, is critical if you are doing Web apps," said Duquaine. Job availabilities: 4,810 6. C What it is: A standardized, general-purpose programming language, it's one of the most pervasive languages and the basis for several others (such as C++). Why you should learn it: "Learning C is crucial. Once you learn C, making the jump to Java or C# is fairly easy, because a lot of the syntax is common. Also, a lot of C syntax is used in scripting languages," said Duquaine. Job availabilities: 6,164, including all derivatives 7. Ruby and Ruby on Rails What they are: Ruby is a dynamic, object-oriented, open-source programming language; Ruby on Rails is an open-source Web application framework written in Ruby that closely follows the MVC (Model-View-Controller) architecture. Why you should learn it: With a focus on simplicity, productivity and letting the computer do the work, its usage has spread quickly in just a few years. As a bonus, many find it easy to learn. Job availabilities: 210 and 54, respectively 8. Java What it is: An object-oriented programming language developed by James Gosling and colleagues at Sun Microsystems in the early 1990s. Why you should learn it: Hailed by many developers as a "beautiful" language, it is central to the non-.Net programming experience. 
"Learning Java is critical if you are non-Microsoft," said Duquaine. Job availabilities: 14,408 9. Python What it is: An interpreted, dynamic, object-oriented, open-source programming language that utilizes automatic memory management. Why you should learn it: Designed to be a highly readable, minimalist language (many say it has a sense of humor: spam and eggs, rather than foo and bar), Python is used extensively by Google as well as in academia because of its syntactic simplicity. Job availabilities: 811 10. VB.Net (Visual Basic .Net) What it is: An object-oriented language implemented on Microsoft's .Net framework. Why you should learn it: Many argue that VB.Net is currently more popular than ever and one of the few "must-learns." "It is currently dominating in adoption, and that is where all the work is," said Huckaby. Job availabilities: 2,090 * All numbers on job availability were pulled from nationwide queries on Dice.com, a job site for technology professionals.
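The readability claim made for Python above is easy to see in practice. A minimal sketch (the task and the function name are made up for illustration): tallying word frequencies takes only a few legible lines, using nothing but the standard library.

```python
from collections import Counter

def word_counts(text):
    # Lower-case the text, split on whitespace, and tally occurrences.
    return Counter(text.lower().split())

counts = word_counts("the quick brown fox jumps over the lazy dog")
print(counts["the"])  # prints: 2
```

The same program in C or Java would need explicit data-structure setup; here the intent reads almost like the problem statement.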
[ Read More ]

How to Learn a Programming Language

1. Decide what you want to do. Some programming applications with a strong Web presence and good materials for beginners are game programming, Web site creation, automation of common tasks ("scripting"), text processing, and scientific problem solving. If you just think programming would be cool to learn and don't have any specific applications in mind, that's okay, but thinking about what you want to program in advance will help you make informed decisions during your learning experience. Also remember that programming can be a frustrating job if you don't pay proper attention or make too many mistakes while writing code. 2. Choose a programming language. When you first begin to learn, choose an easy-to-learn, high-level language such as Python. Later, you may move on to a lower-level language such as C or C++ to better understand how exactly programs run and interact. Perl and Java are other popular languages for beginners. Research your target application to learn whether there are languages you should definitely know (e.g. SQL for databases) or avoid. Don't be confused by jargon like "object-oriented", "concurrent", or "dynamic"; these all mean things, but you won't be able to understand them until you actually have some programming experience. 3. Find learning resources. Search the Web for good places to start on the languages mentioned above, and be sure to check the language's home page (if it has one) for an official guide or handbook. Also, find someone who already knows how to program. Online tutorials are nice, but they can be frustrating at times if you can't get answers to specific questions. Sometimes libraries and videos help a lot. 4. Start small. You can't expect to write a bestselling 700-page masterpiece if you have no practical writing experience; programming is the same way. 
Start with basic constructs and write small programs (10 to 30 lines) to test your understanding of the concepts. Stretch yourself, but don't try to run before you can walk. 5. Put in the time. It takes many hours of practicing problem-solving skills on different types of problems before you can call yourself an expert. Project Euler has many small programming assignments, ranked roughly by difficulty, that are useful for honing your skills and keeping in practice. Also learn to make flowcharts. 6. Keep at it. Programming can be very frustrating, but successfully completing a program can be intensely satisfying and pleasing. Don't give up if you don't understand a concept; programming can be a very abstract thing to learn. When working on a particularly intricate problem, take periodic breaks to let your brain relax and relegate the problem to your subconscious mind. Make a good schedule for working. 7. Keep learning. Knowing one programming language is good, but knowing four or five is better. Regardless of what language you use most often, having knowledge of others to draw on will make you a better programmer and better able to understand common constructs and problems in the abstract. So learn several programming languages, especially two or three with different design philosophies, such as Lisp, Java, and Perl. But learn each of them properly.
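To make the "start small" advice concrete, here is a sketch of Project Euler's first problem (sum all the multiples of 3 or 5 below 1000). Python is assumed here only because it was suggested above as a beginner language; the same few lines translate directly to most languages.

```python
def sum_of_multiples(limit):
    # One loop, one condition, one accumulator: a classic first exercise.
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

print(sum_of_multiples(1000))  # prints: 233168
```

An exercise like this fits comfortably in the 10-to-30-line range, and its answer can be checked by hand for small inputs (below 10, the multiples are 3, 5, 6 and 9, which sum to 23).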
[ Read More ]

Top 10 Enterprise Database Systems to Consider

How far back does your knowledge of databases go: the late 1980s, the mid-1990s, five years ago? If it has been a while, you might not recognize some of the old-timers in this list. You'll also do a double take if you didn't know many of them have their roots in the mid-to-late 1970s. It would be hard to argue that the database market is not mature. It's also highly competitive, and enterprise database systems come packed with features, from hot backups to high availability. These database systems range in price from free to tens of thousands of dollars. There's no single correct answer for every data problem. Nor is there a perfect database system; each has its own set of features and shortcomings. 1. Oracle Oracle began its journey in 1979 as the first commercially available relational database management system (RDBMS). Oracle's name is synonymous with enterprise database systems, unbreakable data delivery and fierce corporate competition from CEO Larry Ellison. Powerful but complex database solutions are the mainstay of this Fortune 500 company (currently 105th, but 27th in terms of profitability). 2. SQL Server Say what you will about Microsoft and its interesting collection of officers. Its profitability exceeds that of all other tech companies, and SQL Server helped put it there. Sure, Microsoft's desktop operating system is everywhere, but if you're running a Microsoft server, you're likely running SQL Server on it. 
SQL Server's ease of use, availability and tight Windows operating system integration make it an easy choice for firms that choose Microsoft products for their enterprises. Currently, Microsoft touts SQL Server 2008 as the platform for business intelligence solutions. 3. DB2 Big Blue puts the big into data centers with DB2. DB2 runs on Linux, UNIX, Windows and mainframes. Citing a study by the International Technology Group, IBM pits its DB2 9.7 system squarely against Oracle's 11g and shows significant cost savings for those that migrate to DB2 from Oracle. How significant? How does 34 percent to 39 percent for comparative installations over a three-year period sound? 4. Sybase Sybase is still a major force in the enterprise market after 25 years of success and improvements to its Adaptive Server Enterprise product. Although its market share dwindled for a few years, it's returning with powerful positioning in the next-generation transaction processing space. Sybase has also thrown a considerable amount of weight behind the mobile enterprise by delivering partnered solutions to the mobile device market. 5. MySQL MySQL began as a niche database system for developers but grew into a major contender in the enterprise database market. Sold to Sun Microsystems in 2008, MySQL is currently part of the Oracle empire (January 2010). More than just a niche database now, MySQL powers hundreds of thousands of commercial websites and a huge number of internal enterprise applications. Although MySQL's community and commercial adopters had reservations about Oracle's ownership of this popular open-source product, Oracle has publicly declared its commitment to ongoing development and support. 6. PostgreSQL PostgreSQL, the world's most advanced open-source database, hides in such interesting places as online gaming applications, data center automation suites and domain registries. It also enjoys some high-profile duties at Skype, Yahoo! and MySpace. 
PostgreSQL is in so many strange and obscure places that it might deserve the moniker "Best-Kept Enterprise Database Secret." Version 9.0, currently in beta, will arrive for general consumption later this year. 7. Teradata Have you ever heard of Teradata? If you've built a large data warehouse in your enterprise, you probably have. As early as the late 1970s, Teradata laid the groundwork for the first data warehouse, before the term existed. It created the first terabyte database for Wal-Mart in 1992. Since that time, data warehousing experts almost always say Teradata in the same sentence as enterprise data warehouse. 8. Informix Another IBM product in the list brings you to Informix. IBM offers several Informix versions: from its limited Developer Edition, to its entry-level Express Edition, to a low-maintenance online transaction processing (OLTP) Workgroup Edition, all the way up to its high-performance OLTP Enterprise Edition. Often associated with universities and colleges, Informix made the leap to the corporate world to take a No. 1 spot in customer satisfaction. Informix customers often speak of its low cost, low maintenance and high reliability. 9. Ingres Ingres is the parent open-source project of PostgreSQL and other database systems, and it is still around to brag about it. Ingres is all about choice, and choosing it might mean lowering your total cost of ownership for an enterprise database system. Beyond an attractive pricing structure, Ingres prides itself on its ability to ease your transition from costlier database systems. Ingres also incorporates security features required for HIPAA and Sarbanes-Oxley compliance. 10. Amazon's SimpleDB Databases and Amazon.com seem worlds apart, but they aren't. Amazon's SimpleDB offers enterprises a simple, flexible and inexpensive alternative to traditional database systems. SimpleDB boasts low maintenance, scalability, speed and integration with Amazon services. 
As part of Amazon's EC2 offering, you can get started with SimpleDB for free. Ken Hess is a freelance writer who writes on a variety of open source topics including Linux, databases, and virtualization. He is also the coauthor of Practical Virtualization Solutions, which is scheduled for publication in October 2009. You may reach him through his web site at http://www.kenhess.com.
[ Read More ]

Top 5 Best Databases

As part of the contest we conducted recently, we got 160+ comments from the geeky readers who chose their favorite database. Based on this data, the top spot goes to... drum roll please... MySQL. If you are new to any of the top 5 databases mentioned here, please read the rest of the article to understand more about them. 1. MySQL MySQL is used in almost all the open-source web projects that require a database in the back end. MySQL is part of the powerful LAMP stack, along with Linux, Apache and PHP. It was originally created by a company called MySQL AB, which was acquired by Sun Microsystems, which was in turn acquired by Oracle. Since we don't know what Oracle will do with the MySQL database, the open-source community has created several forks of MySQL, including Drizzle and MariaDB. Following are a few key features: Written in C and C++. The MyISAM storage engine uses B-tree disk tables with index compression for high performance. Support for partitioning and replication. Support for XPath and full-text search. Support for stored procedures, triggers, views, etc. Additional Information: Home Page: http://www.mysql.com/ Stable release: 5.1.43 License: GNU GPL Read more about MySQL, Drizzle and MariaDB on Wikipedia. 2. PostgreSQL PostgreSQL is an open-source object-relational database system. It runs on most *nix flavors, Windows and Mac OS. It has full support for joins, views, triggers, stored procedures, etc. Following are a few key features: MVCC (Multi-Version Concurrency Control) Hot backups and point-in-time recovery Support for tablespaces Asynchronous replication Highly scalable Additional Information: Home page: http://www.postgresql.org/ Stable release: 8.4.2 License: Open source, MIT-style license Read more about PostgreSQL on Wikipedia. 3. Oracle Oracle is the best database for any mission-critical commercial application. 
Oracle offers the following four editions of the database: 1) Enterprise Edition 2) Standard Edition 3) Standard Edition One 4) Express Edition Following are a few key features of the Oracle database: Real Application Clusters (RAC) Data Guard for standby databases Virtual Private Database Automatic Memory, Storage and Undo Management OLAP, Partitioning, Data Mining Advanced Queuing, XML DB, support for spatial data Flashback Database, Query, Table and Transaction Additional Information: Home Page: http://www.oracle.com/us/products/database/index.htm Stable release: 11g R2 License: Proprietary Read more about Oracle on Wikipedia. 4. SQLite SQLite does not follow the traditional client-server model with a standalone server process. Instead, it is a self-contained, serverless SQL database engine. Main features of SQLite: Zero configuration, with no setup or admin tasks. The complete database is stored in a single disk file. No external dependencies. Supports databases of several terabytes in size. Works on most *nix flavors, Mac OS X and Windows; it is also cross-platform, and WinCE is supported out of the box. Additional Information: Home Page: http://sqlite.org/index.html Developed by: D. Richard Hipp Stable release: 3.6.22 License: Public domain Read more about SQLite on Wikipedia. 5. Microsoft SQL Server This is Microsoft's flagship database product. If you work at a company that heavily uses Microsoft products, you might end up working on MS SQL Server. Home Page: http://www.microsoft.com/sqlserver/2008/en/us/default.aspx License: Microsoft EULA Stable release: SQL Server 2008 Written in: C, C++ Read more about MS SQL Server on Wikipedia.
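SQLite's zero-configuration, serverless design is easy to demonstrate from Python, whose standard library bundles the engine as the sqlite3 module. A minimal sketch (the table and rows are made up for the example):

```python
import sqlite3

# No server process and no setup: connect straight to a database file,
# or, as here, to a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE albums (title TEXT, artist TEXT)")
conn.execute("INSERT INTO albums VALUES (?, ?)",
             ("Kind of Blue", "Miles Davis"))
conn.commit()

rows = conn.execute("SELECT title, artist FROM albums").fetchall()
print(rows)  # prints: [('Kind of Blue', 'Miles Davis')]
conn.close()
```

Swapping ":memory:" for a filename gives you the single-disk-file database described above, with nothing to install or administer.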
[ Read More ]

HTML Introduction

<!DOCTYPE html>
<html>
<body>

<h1>My First Heading</h1>

<p>My first paragraph.</p>

</body>
</html>

Example Explained The <!DOCTYPE html> declaration defines the document type. The text between <html> and </html> describes the web page. The text between <body> and </body> is the visible page content. The text between <h1> and </h1> is displayed as a heading. The text between <p> and </p> is displayed as a paragraph. The <!DOCTYPE html> declaration is the doctype for HTML5. What is HTML? HTML is a language for describing web pages. HTML stands for Hyper Text Markup Language. HTML is a markup language. A markup language is a set of markup tags. The tags describe document content. HTML documents contain HTML tags and plain text. HTML documents are also called web pages. HTML Tags HTML markup tags are usually called HTML tags. HTML tags are keywords (tag names) surrounded by angle brackets, like <html>. HTML tags normally come in pairs, like <p> and </p>. The first tag in a pair is the start tag; the second tag is the end tag. The end tag is written like the start tag, with a forward slash before the tag name: <tagname>content</tagname>. Start and end tags are also called opening tags and closing tags. HTML Elements "HTML tags" and "HTML elements" are often used to describe the same thing. But strictly speaking, an HTML element is everything between the start tag and the end tag, including the tags. HTML Element: <p>This is a paragraph.</p>

Web Browsers The purpose of a web browser (such as Google Chrome, Internet Explorer, Firefox, Safari) is to read HTML documents and display them as web pages. The browser does not display the HTML tags, but uses the tags to interpret the content of the page: HTML Page Structure Below is a visualization of an HTML page structure:

<html>
<body>
<h1>This is a Heading</h1>
<p>This is a paragraph.</p>
<p>This is another paragraph.</p>
</body>
</html>

HTML Versions Since the early days of the web, there have been many versions of HTML:

Version     Year
HTML        1991
HTML+       1993
HTML 2.0    1995
HTML 3.2    1997
HTML 4.01   1999
XHTML 1.0   2000
HTML5       2012
XHTML5      2013

The <!DOCTYPE> Declaration The <!DOCTYPE> declaration helps the browser to display a web page correctly. There are many different documents on the web, and a browser can only display an HTML page 100% correctly if it knows the HTML type and version used. Common Declarations:

HTML5: <!DOCTYPE html>

HTML 4.01: <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">

XHTML 1.0: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
[ Read More ]

About Delphi Programming - for Novice Developers and First Time Visitors

Hi! I'm Zarko Gajic, your About.com Guide to Delphi Programming. That's my picture at the top of the page (or maybe at the bottom). You can read my bio to learn more about who I am. I write feature articles and tutorials related to Delphi programming. I also gather links to other sites that have articles, tutorials, and important information on specific aspects of programming in the Delphi language. The purpose of this page is to orient newcomers with an overview of some special features of the About Delphi Programming Web site. Before you start exploring the vast amount of Delphi-related topics this site covers, I would strongly suggest signing up for the FREE Delphi Newsletter (100% no spam) to make sure you are up to date with new Delphi tutorials, articles and tips posted daily on About Delphi Programming. Embarcadero Technologies Delphi is an object-oriented, visual programming environment for developing 32- and 64-bit applications; with FireMonkey, Delphi is the fastest way to deliver ultra-rich and visually stunning native applications for Windows, Mac and iOS. If you are just entering the programming world, here's why you should consider learning Delphi: Why Delphi? Also, don't miss Delphi History! If you are confused about the different Delphi versions (Delphi Starter, Delphi XE2, RAD Studio), read the "Flavors of Delphi" article to easily pick your Delphi of choice. There is a lot of information on this site about Delphi programming; this site covers all aspects of Delphi development, including tutorials and articles, a forum, a language reference with examples, a glossary, free code programs, custom components and much more. Let me help you find what you're looking for (and help your career by looking for the right Delphi job). Learn how Delphi can help you solve complex development problems to deliver high-performance, highly scalable applications ranging from Windows and database applications to mobile and distributed applications for the Internet. 
If you simply want to build a small database application (accounting, a CD/DVD album catalog) for home use, Delphi will help you build it quickly and with ease.

Looking for something specific? You can search this Delphi Programming site or all of About.com for a specific programming task. Try it using the search box at the top of the page. Hint: put phrases in double quotation marks for better results (e.g. "protected hack"). If you are looking for more ways to find Delphi programming materials, see the "Searching for Delphi" article.

True Beginners, Students, Newcomers ... For those who are new to Delphi, I've prepared several free online courses designed to get you off to a fast start. The free courses below are perfect for Delphi beginners as well as for those who want a broad overview of the art of programming with Delphi:

Turbo Delphi Tutorial: For Novice and Non-Programmers
A Beginner's Guide to Delphi Programming
A Beginner's Guide to Delphi Database Programming
A Beginner's Guide to ASP.NET Web Programming for Delphi Developers

Be sure not to miss the Delphi Tutorials and Online / Email Courses section.

How to program in Delphi – what do you need to know? This entire site is devoted to providing the tutorials and other resources needed to learn Delphi programming. There are several broad categories of Delphi programming tutorials to help you in your quest to learn how to create the best solutions fast. These include tutorials for beginners as well as for more experienced developers; find them listed in A Beginner's Guide to Delphi [enter Delphi topic]. If you are looking for free, shareware, or commercial components, you'll be happy to know I've prepared a dozen Top Picks pages, where the best third-party components, tools, and Delphi books are collected and reviewed.

Be the first to know!
Sign up for the Delphi Programming Newsletter, grab the Delphi Programming Headlines RSS feed, or follow About Delphi Programming on Facebook.