Showing posts with label Things You Must Know. Show all posts

Tuesday, August 3, 2010

Is Your Nokia Cell Phone Original?

Nokia is one of the largest-selling phone brands across the globe. Most of us own a Nokia phone but are unaware of its originality. Are you keen to know whether your Nokia mobile phone is original or not? Then you are in the right place, and this information is specially meant for you. Your phone's IMEI (International Mobile Equipment Identity) number confirms your phone's originality.
Press *#06# on your mobile to see your phone's IMEI number (serial number).
Then check the seventh and eighth digits:
Phone serial no.: x x x x x x ? ? x x x x x x x

If the seventh and eighth digits are 02 or 20, your phone was assembled in the Emirates, which is very bad quality.
If the seventh and eighth digits are 08 or 80, your phone was manufactured in Germany, which is fair quality.
If the seventh and eighth digits are 01 or 10, your phone was manufactured in Finland, which is very good quality.
If the seventh and eighth digits are 00, your phone was manufactured in the original factory, which is the best quality.
If the seventh and eighth digits are 13, your phone was assembled in Azerbaijan, which is very bad quality and also dangerous for your health.
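The digit check above can be sketched in a few lines of Python. The `imei_origin` helper below is written just for this post and simply mirrors the table above; it is not an official Nokia API.

```python
def imei_origin(imei: str) -> str:
    """Map the 7th and 8th digits of an IMEI to the origins listed above."""
    digits = "".join(ch for ch in imei if ch.isdigit())
    if len(digits) < 8:
        raise ValueError("an IMEI needs at least 8 digits")
    code = digits[6:8]  # the 7th and 8th digits (0-indexed slice)
    table = {
        "02": "Emirates", "20": "Emirates",
        "08": "Germany",  "80": "Germany",
        "01": "Finland",  "10": "Finland",
        "00": "original factory",
        "13": "Azerbaijan",
    }
    return table.get(code, "unknown")

print(imei_origin("123456 01 1234567"))  # prints "Finland"
```

Any code not covered by the table (for example "03") simply comes back as "unknown".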

How to Save Bookmarks in IE, Firefox, Chrome and Opera

How would you like to save your bookmarks in IE, Firefox, Opera and Google Chrome, so that you can restore them in case you need to re-install your operating system or move them from one computer to another? This post will show you how to save and restore bookmarks in a few simple steps.
Bookmarking your favorite web pages can save a lot of time, as it is almost impossible to remember a long list of favorite websites and their URLs. However, it can be really frustrating when you lose all those saved bookmarks because a computer crashes. Also, if you use more than one computer, it becomes hard to copy all those saved bookmarks one by one manually. Saving the bookmarks to a file comes in handy in such situations. Here is how to do it.
 

Saving a Bookmark file in Internet Explorer

1. From the File menu, select the option Import and Export.
2. Select the option Export to a file and click on Next.
3. In the next screen select Favorites and click on Next.
4. In the next screen again click on Favorites and click on Next.
5. Now choose the destination where you want to save your bookmarks and click on Export.
6. In the next screen click on Finish.
Now you have successfully saved all your bookmarks in a .HTM file. You can use this file later to restore the bookmarks to IE, Firefox or any other browser. To import the saved bookmarks from a file, go to the File menu, click Import and Export, select the option Import from a file and follow the on-screen instructions.
 

Saving a Bookmark file in Firefox

1. From the Bookmarks menu on the top select the option Organize Bookmarks.
2. A window will pop up. From the window, click Import and Backup at the top and select the option Export HTML.
3. Now choose the destination where you want to save the bookmark file and click on Save.
To restore this saved file, follow step 1, and in step 2 select the option Import HTML instead of Export HTML and proceed.
 

Saving a Bookmark file in Google Chrome

1. From the Tools menu, select Bookmark Manager.
2. Click the Organize menu in the manager.
3. Select Export bookmarks.
4. Select the location where you want your exported file to be saved, then click Save.
To restore the bookmarks, follow steps 1 and 2, and in step 3 select Import bookmarks instead of Export bookmarks and proceed.
 

Saving a Bookmark file in Opera

1. From the File menu, select the option Import and Export.
2. Scroll over to the pull-down menu on the right and choose Export Bookmarks as HTML.
3. On the next screen, choose the destination folder from the Save in box at the top of the screen.
4. Just click the Save button and you're done.
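All four browsers above export to the same Netscape-style bookmarks HTML file, so you can inspect or merge exports with a short script. Here is a minimal sketch using Python's standard html.parser; the `BookmarkExtractor` class and the sample string are invented for this post.

```python
from html.parser import HTMLParser

class BookmarkExtractor(HTMLParser):
    """Collect (title, url) pairs from an exported bookmarks HTML file."""

    def __init__(self):
        super().__init__()
        self.links = []      # list of (title, url) tuples
        self._href = None    # href of the <A> tag we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag == "a":                       # HTMLParser lowercases tag names
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._href:                       # text inside an <A> tag = title
            self.links.append((data.strip(), self._href))
            self._href = None

# A tiny stand-in for the exported file's contents:
sample = '<DL><DT><A HREF="http://example.com">Example</A></DL>'
parser = BookmarkExtractor()
parser.feed(sample)
print(parser.links)  # prints [('Example', 'http://example.com')]
```

In practice you would `parser.feed(open("bookmarks.html").read())` on the file you exported in the steps above.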

The Origin of "Google" comes from "Googol"

While stumbling around the web I read a story about the origin of one of the most powerful names on the web today, Google.com. The company is now worth many billions, built on search results. google.com was registered on September 15, 1997, and the rest is history. Read the whole story after the break.

Monday, August 2, 2010

Top 10 Differences Between a LCD TV and a Plasma TV

Plasma TVs and LCD TVs are the two fighters in the current TV market, with CRT TVs dumped in the corner of the shops. Everyone seems to be waiting to buy one of these large but slim, flat displays. But what is the difference between them? Have we ever thought about where the difference lies between these two similar-looking gadgets? What are the pros and cons of each technology? Here is a simplified guide to these complex technologies that you must read to know the truth.

How hackers break into 'secure' websites

LAS VEGAS: Researchers have uncovered new ways that criminals can spy on Internet users even if they're using secure connections to banks, online retailers or other sensitive Web sites. 

The attacks demonstrated at the Black Hat conference here show how determined hackers can sniff around the edges of encrypted Internet traffic to pick up clues about what their targets are up to. 

It's like tapping a telephone conversation and hearing muffled voices that hint at the tone of the conversation. 

The problem lies in the way Web browsers handle Secure Sockets Layer, or SSL, encryption technology, according to Robert Hansen and Josh Sokol, who spoke to a packed room of several hundred security experts. 

Encryption forms a kind of tunnel between a browser and a website's servers. It scrambles data so it's indecipherable to prying eyes. 

SSL is widely used on sites trafficking in sensitive information, such as credit card numbers, and its presence is shown as a padlock in the browser's address bar. 

SSL is a widely attacked technology, but the approach by Hansen and Sokol wasn't to break it. They wanted to see instead what they could learn from what are essentially the breadcrumbs from people's secure Internet surfing that browsers leave behind and that skilled hackers can follow. 

Their attacks would yield all sorts of information. It could be relatively minor, such as browser settings or the number of Web pages visited. It could be quite substantial, including whether someone is vulnerable to having the "cookies" that store usernames and passwords misappropriated by hackers to log into secure sites. 

Hansen said all major browsers are affected by at least some of the issues. 

"This points to a larger problem — we need to reconsider how we do electronic commerce," he said in an interview before the conference, an annual gathering devoted to exposing the latest computer-security vulnerabilities. 

For the average Internet user, the research reinforces the importance of being careful on public Wi-Fi networks, where an attacker could plant himself in a position to look at your traffic. For the attacks to work, the attacker must first have access to the victim's network. 

Hansen and Sokol outlined two dozen problems they found. They acknowledged attacks using those weaknesses would be hard to pull off. 

The vulnerabilities arise out of the fact people can surf the Internet with multiple tabs open in their browsers at the same time, and that unsecured traffic in one tab can affect secure traffic in another tab, said Hansen, chief executive of consulting firm SecTheory. Sokol is a security manager at National Instruments Corp. 

Their talk isn't the first time researchers have looked at ways to scour secure Internet traffic for clues about what's happening behind the curtain of encryption. It does expand on existing research in key ways, though. 

"Nobody's getting hacked with this tomorrow, but it's innovative research," said Jon Miller, an SSL expert who wasn't involved in the research. 

Miller, director of Accuvant Labs, praised Hansen and Sokol for taking a different approach to attacking SSL. 

"Everybody's knocking on the front door, and this is, 'let's take a look at the windows,'" he said. "I never would have thought about doing something like this in a million years. I would have thought it would be a waste of time. It's neat because it's a little different." 

Sunday, August 1, 2010

Introducing Computer Hardware




Input/Output Devices


In computing, input/output, or I/O, refers to the communication between an information processing system (such as a computer) and the outside world, possibly a human or another information processing system. Inputs are the signals or data received by the system, and outputs are the signals or data sent from it. The term can also be used as part of an action; to "perform I/O" is to perform an input or output operation. I/O devices are used by a person (or other system) to communicate with a computer. For instance, a keyboard or a mouse may be an input device for a computer, while monitors and printers are considered output devices. Devices for communication between computers, such as modems and network cards, typically serve for both input and output.
Note that the designation of a device as either input or output depends on the perspective. Mice and keyboards take as input the physical movement that the human user outputs and convert it into signals that a computer can understand. The output from these devices is input for the computer. Similarly, printers and monitors take as input the signals that a computer outputs, and convert them into representations that human users can see or read. For a human user, the process of reading or seeing these representations is receiving input. These interactions between computers and humans are studied in a field called human–computer interaction.
In computer architecture, the combination of the CPU and main memory (i.e. memory that the CPU can read and write to directly, with individual instructions) is considered the brain of a computer, and from that point of view any transfer of information from or to that combination, for example to or from a disk drive, is considered I/O. The CPU and its supporting circuitry provide memory-mapped I/O that is used in low-level computer programming in the implementation of device drivers. An I/O algorithm is one designed to exploit locality and perform efficiently when data reside on secondary storage, such as a disk drive.

Interface

An I/O interface is required whenever an I/O device is driven by the processor. The interface must have the necessary logic to interpret the device address generated by the processor. Handshaking should be implemented by the interface using appropriate commands (BUSY, READY, WAIT), so that the processor can communicate with the I/O device through the interface. If different data formats are being exchanged, the interface must be able to convert serial data to parallel form and vice versa. There must also be provision for generating interrupts, and the corresponding type numbers, for further processing by the processor if required.
A computer that uses memory-mapped I/O accesses hardware by reading and writing to specific memory locations, using the same assembly language instructions that the computer would normally use to access memory.
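To get a feel for that memory-mapped style of access, the sketch below uses Python's mmap module on an ordinary 16-byte file standing in for a device's register block. The register offsets are made up for illustration; real device registers live at addresses assigned by the hardware and need OS or driver support to reach.

```python
import mmap
import os
import tempfile

REG_STATUS, REG_DATA = 0x00, 0x04   # hypothetical register offsets

# Stand in for a device's register block with a 16-byte scratch file.
fd, path = tempfile.mkstemp()
os.write(fd, b"\x00" * 16)

with mmap.mmap(fd, 16) as regs:
    regs[REG_DATA] = 0x2A           # "write a register": a plain memory store
    data = regs[REG_DATA]           # "read a register":  a plain memory load
    status = regs[REG_STATUS]       # device status, still its initial 0

os.close(fd)
os.remove(path)
print(hex(data), status)  # prints "0x2a 0"
```

The point is that no special I/O instruction appears anywhere: the same indexing that works on ordinary memory moves data to and from the "device".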

Higher-level implementation

Higher-level operating system and programming facilities employ separate, more abstract I/O concepts and primitives. For example, most operating systems provide application programs with the concept of files. The C and C++ programming languages, and operating systems in the Unix family, traditionally abstract files and devices as streams, which can be read or written, or sometimes both. The C standard library provides functions for manipulating streams for input and output.
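The stream abstraction works the same way whether the bytes come from a file, a device, or plain memory. A small Python sketch using the standard io module:

```python
import io

# A stream can wrap memory just as easily as a file or a device.
buf = io.StringIO()
buf.write("hello, ")
buf.write("world\n")

buf.seek(0)              # rewind, then read the same stream back
line = buf.readline()
print(line, end="")      # prints "hello, world"
```

Swapping `io.StringIO()` for `open("somefile.txt", "w+")` would leave the read/write/seek calls unchanged, which is exactly the point of the abstraction.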
In the context of the ALGOL 68 programming language, the input and output facilities are collectively referred to as transput. The ALGOL 68 transput library recognizes the following standard files/devices: stand in, stand out, stand errors and stand back.
An alternative to special primitive functions is the I/O monad, which permits programs to merely describe I/O, with the actions carried out outside the program. This is notable because I/O functions would otherwise introduce side effects into the language; with the monad, purely functional programming remains practical.

Addressing mode

There are many ways through which data can be read or stored in the memory. Each method is an addressing mode, and has its own advantages and limitations.
There are many types of addressing modes, such as direct addressing, indirect addressing, immediate addressing, index addressing, based addressing, based-index addressing, implied addressing, etc.

Direct address

In this mode, the address of the data is part of the instruction itself. When the processor decodes the instruction, it gets the memory address from which it can read or store the required information.
MOV Reg, [Addr]
Here the Addr operand points to the memory location that holds the data; the instruction copies that data into the specified register.

Indirect address

Here the address is stored in a register, and the instruction names that register. To fetch the data, the instruction is decoded and the appropriate register selected; the contents of the register are treated as the address, the corresponding memory location is selected, and the data is read or written.
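The two modes can be illustrated with a toy machine in Python, where a list stands in for memory and a dict for the registers. Names like PTR are invented for this sketch; no real instruction set is being modeled.

```python
# Toy machine: a list is "memory", a dict holds the "registers".
memory = [0] * 16
regs = {"R1": 0, "R2": 0, "PTR": 0}

memory[5] = 99       # the data lives at address 5
regs["PTR"] = 5      # PTR holds the *address* of the data

# Direct addressing:   MOV R1, [5]
# The address (5) is part of the instruction itself.
regs["R1"] = memory[5]

# Indirect addressing: MOV R2, [PTR]
# The address is first fetched from a register, then used to pick
# the memory location.
regs["R2"] = memory[regs["PTR"]]

print(regs["R1"], regs["R2"])  # prints "99 99"
```

Both instructions land on the same data; the difference is only in where the address comes from, which is exactly the distinction between the two modes above.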

Generations of Computers

First-generation machines

Even before the ENIAC was finished, Eckert and Mauchly recognized its limitations and started the design of a stored-program computer, EDVAC. John von Neumann was credited with a widely circulated report describing the EDVAC design, in which both the programs and working data were stored in a single, unified store. This basic design, denoted the von Neumann architecture, would serve as the foundation for the worldwide development of ENIAC's successors. In this generation of equipment, temporary or working storage was provided by acoustic delay lines, which used the propagation time of sound through a medium such as liquid mercury (or through a wire) to briefly store data. A series of acoustic pulses is sent along a tube; when a pulse reaches the end of the tube, the circuitry detects whether it represents a 1 or a 0 and causes the oscillator to re-send it. Other machines used Williams tubes, which use the ability of a small cathode-ray tube (CRT) to store and retrieve data as charged areas on the phosphor screen. By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, and dominated the field through the mid-1970s.
EDVAC was the first stored-program computer designed; however, it was not the first to run. Eckert and Mauchly left the project and its construction floundered. The first working von Neumann machine was the Manchester "Baby" or Small-Scale Experimental Machine, developed by Frederic C. Williams and Tom Kilburn at the University of Manchester in 1948 as a test bed for the Williams tube; it was followed in 1949 by the Manchester Mark 1, a complete system using Williams tube and magnetic drum memory and introducing index registers. The other contender for the title "first digital stored-program computer" was EDSAC, designed and constructed at the University of Cambridge. Operational less than one year after the Manchester "Baby", it was also capable of tackling real problems. EDSAC was actually inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer), the successor to ENIAC; these plans were already in place by the time ENIAC was successfully operational. Unlike ENIAC, which used parallel processing, EDVAC used a single processing unit. This simpler design was the first to be implemented in each succeeding wave of miniaturization, and it increased reliability. Some view the Manchester Mark 1 / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture. Manchester University's machine became the prototype for the Ferranti Mark 1. The first Ferranti Mark 1 machine was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.
The first universal programmable computer in the Soviet Union was created by a team of scientists under the direction of Sergei Alekseyevich Lebedev from the Kiev Institute of Electrotechnology in the Soviet Union (now Ukraine). The computer MESM (МЭСМ, Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was CSIRAC, an Australian design that ran its first test program in 1949. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music.


Commercial computers

The first commercial computer was the Ferranti Mark 1, which was delivered to the University of Manchester in February 1951. It was based on the Manchester Mark 1. The main improvements over the Manchester Mark 1 were in the size of the primary storage (using random-access Williams tubes), secondary storage (using a magnetic drum), a faster multiplier, and additional instructions. The basic cycle time was 1.2 milliseconds, and a multiplication could be completed in about 2.16 milliseconds. The multiplier used almost a quarter of the machine's 4,050 vacuum tubes (valves). A second machine was purchased by the University of Toronto before the design was revised into the Mark 1 Star. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.
In October 1947, the directors of J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951 and ran the world's first regular routine office computer job. On 17 November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business application to go live on a stored-program computer.
In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than $1 million each ($8.38 million as of 2010). UNIVAC was the first "mass produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). A key feature of the UNIVAC system was a newly invented type of metal magnetic tape, and a high-speed tape unit, for non-volatile storage. Magnetic media are still used in many computers. In 1952, IBM publicly announced the IBM 701 Electronic Data Processing Machine, the first in its successful 700/7000 series and its first IBM mainframe computer. The IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines. The first implemented high-level general-purpose programming language, Fortran, was also being developed at IBM for the 704 during 1955 and 1956, and released in early 1957. (Konrad Zuse's 1945 design of the high-level language Plankalkül was not implemented at that time.) A volunteer user group, which exists to this day, was founded in 1955 to share software and experiences with the IBM 701.
IBM 650 front panel
IBM introduced a smaller, more affordable computer in 1954 that proved very popular. The IBM 650 weighed over 900 kg; the attached power supply weighed around 1,350 kg, and both were held in separate cabinets of roughly 1.5 by 0.9 by 1.8 meters. It cost $500,000 ($4.05 million as of 2010) or could be leased for $3,500 a month ($30 thousand as of 2010). Its drum memory was originally 2,000 ten-digit words, later expanded to 4,000 words. Memory limitations such as this were to dominate programming for decades afterward. The program instructions were fetched from the spinning drum as the code ran. Efficient execution using drum memory was provided by a combination of hardware architecture (the instruction format included the address of the next instruction) and software (the Symbolic Optimal Assembly Program, SOAP, assigned instructions to optimal addresses, to the extent possible by static analysis of the source program). Thus many instructions were, when needed, located in the next row of the drum to be read, and additional wait time for drum rotation was not required.
In 1955, Maurice Wilkes invented microprogramming, which allows the base instruction set to be defined or extended by built-in programs (now called firmware or microcode). It was widely used in the CPUs and floating-point units of mainframe and other computers, such as the Manchester Atlas and the IBM 360 series.
IBM introduced its first magnetic disk system, RAMAC (Random Access Method of Accounting and Control), in 1956. Using fifty 24-inch (610 mm) metal disks with 100 tracks per side, it was able to store 5 megabytes of data at a cost of $10,000 per megabyte ($80 thousand as of 2010).

Second generation: transistors

The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power. The first transistorised computer was built at the University of Manchester and was operational by 1953; a second version was completed there in April 1955. The later machine used 200 transistors and 1,300 solid-state diodes and had a power consumption of 150 watts. However, it still required valves to generate the clock waveforms at 125 kHz and to read and write on the magnetic drum memory, whereas the Harwell CADET operated without any valves by using a lower clock frequency of 58 kHz when it became operational in February 1955. Problems with the reliability of early batches of point-contact and alloyed-junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available.
Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost. Typically, second-generation computers were composed of large numbers of printed circuit boards, such as the IBM Standard Modular System, each carrying one to four logic gates or flip-flops.
A second generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than one hundred thousand 1401s between 1960 and 1964.
This RAMAC DASD is being restored at the Computer History Museum
Transistorized electronics improved not only the CPU (central processing unit), but also the peripheral devices. The IBM 350 RAMAC was introduced in 1956 and was the world's first disk drive. Second-generation disk data storage units were able to store tens of millions of letters and digits. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack can be easily exchanged with another stack in a few seconds. Even though the removable disks' capacity is smaller than that of fixed disks, their interchangeability guarantees a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.
Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One databus would bear data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other databusses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second), because most operations took at least two memory cycles: one for the instruction, one for the operand data fetch.
During the second generation remote terminal units (often in the form of teletype machines like a Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers separation between remote-terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected network of networks—the Internet.

Post-1960: third generation and beyond

The explosion in the use of computers began with "third-generation" computers, making use of Jack St. Clair Kilby's and Robert Noyce's independent invention of the integrated circuit (or microchip), which later led to the invention of the microprocessor by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel. For example, the Intel 8742 is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O, all on the same chip.
During the 1960s there was considerable overlap between second- and third-generation technologies. IBM implemented its IBM Solid Logic Technology modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494. The Burroughs large systems such as the B5000 were stack machines, which allowed for simpler programming. These pushdown automatons were also implemented in minicomputers and microprocessors later, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business and universities. It became possible to simulate analog circuits with the Simulation Program with Integrated Circuit Emphasis, or SPICE (1971), on minicomputers, one of the programs for electronic design automation (EDA). The microprocessor led to the development of the microcomputer: small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, co-founder of Apple Computer, is sometimes erroneously credited with developing the first mass-market home computers. However, his first computer, the Apple I, came out some time after the MOS Technology KIM-1 and Altair 8800, and the first Apple computer with graphic and sound capabilities came out well after the Commodore PET. Computing has evolved with microcomputer architectures, with features added from their larger brethren, now dominant in most market segments.
Systems as complicated as computers require very high reliability. ENIAC remained in continuous operation from 1947 to 1955, eight years, before being shut down. Although a vacuum tube might fail, it would be replaced without bringing down the system. By the simple strategy of never shutting down ENIAC, the failures were dramatically reduced. Hot-pluggable hard disks, like the hot-pluggable vacuum tubes of yesteryear, continue the tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance is made even more stringent when server farms are the delivery platform. Google has managed this by using fault-tolerant software to recover from hardware failures, and is even working on the concept of replacing entire server farms on the fly, during a service event.
In the twenty-first century, multi-core CPUs became commercially available. Content-addressable memory (CAM) has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. During the 1980s, CMOS logic gates developed into devices that could be made as fast as other circuit types; computer power consumption could therefore be decreased dramatically. Unlike the continuous current draw of a gate based on other logic types, a CMOS gate only draws significant current during the transition between logic states, apart from leakage.
This has allowed computing to become a commodity which is now ubiquitous, embedded in many forms, from greeting cards and telephones to satellites. Computing hardware and its software have even become a metaphor for the operation of the universe. Although DNA-based computing and quantum qubit computing are years or decades in the future, the infrastructure is being laid today, for example, with DNA origami on photolithography. Fast digital circuits (including those based on Josephson junctions and rapid single flux quantum technology) are becoming more nearly realizable with the discovery of nanoscale superconductors.
An indication of the rapidity of development in this field can be inferred from the history of the seminal article. By the time anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC, and immediately started implementing their own systems. To this day, the pace of development has continued, worldwide.