What is the Difference Between WiFi and Wireless Internet?

Wireless Internet access is just one of the services that WiFi can support. WiFi is a wireless communication standard that lets computer devices share files and resources. The signal cannot travel long distances without losing integrity, so it is used for Local Area Networks (LANs). In the home, the LAN might include a personal desktop system and a laptop, while in the workplace, the network commonly connects numerous computers within a commercial building. The signal might also cover a small region within a city, creating hotspots: places where the signal allows the public to connect through wireless access points (WAPs).

A WiFi network is very easy to set up. The main computer acts as a server with a network interface card (NIC). The NIC features a small antenna that broadcasts and receives WiFi signals. A router and switch direct traffic on the network and are commonly built into a high-speed modem, integrating Internet access into the WiFi LAN. Each computer connected to the network, referred to as a client, also requires a WiFi NIC.
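The server/client relationship described above can be illustrated with a minimal sketch in Python. This is an assumption-laden demonstration, not WiFi-specific code: both ends run in a single process on the local machine, whereas on a real LAN the client would connect to the server computer's network address.

```python
import socket
import threading

HOST = "127.0.0.1"  # local machine; on a real LAN this would be the server's address

# The "server" machine binds a listening socket first (port 0 lets the OS pick a free port)...
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, 0))
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    # Accept one client connection and reply with a greeting.
    conn, _ = srv.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"hello, " + data)

t = threading.Thread(target=serve_once)
t.start()

# ...and the "client" connects to it, much as a WiFi client joins the LAN.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, port))
    cli.sendall(b"client")
    reply = cli.recv(1024)
t.join()
srv.close()
print(reply.decode())  # prints: hello, client
```

The same pattern (one machine listening, others connecting) underlies file sharing and resource access on any LAN, wired or wireless.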

Personal digital assistants, cell phones, and other handheld electronics commonly have built-in WiFi capability. This allows them to connect wirelessly to a WiFi-enabled network to transfer files, access data, or surf the Internet.

WiFi formerly stood for “Wireless Fidelity,” but the Wi-Fi Alliance, which certifies equipment built to the standard, has moved away from that designation. The standard exists so that manufacturers can produce interoperable components that are compatible with one another. Without this common standard, each manufacturer would have a proprietary version of WiFi, making it very difficult for consumers to buy equipment: every network would have to be built around a single brand name. Moreover, networks of different brands would have no way to communicate with one another, and public access strategies would be all but impossible.

Since the standard is always improving, different versions represent it at different phases of evolution. Standard 802.11a saw some success, but it operates in the 5-gigahertz (GHz) range, requiring nearly line-of-sight operation. The first widely adopted WiFi standard was 802.11b, which uses the 2.4 GHz range – a lower frequency that does not require near line-of-sight operation.
Standard 802.11g followed, increasing the maximum data transfer rate from 802.11b’s 11 megabits per second (Mbps) to 54 Mbps. As of fall 2006, the newest draft standard, 802.11n, increases this rate to 540 Mbps. Signals can successfully transmit data without loss of integrity over roughly 100 to 160 feet (30 to 50 meters), depending on the version used.
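To put these rates in perspective, a short Python sketch can compare how long each standard would need to move the same file. The 100-megabyte file size is an arbitrary example, and these are theoretical peak rates; real-world throughput is considerably lower.

```python
# Theoretical peak rates for each standard, in megabits per second (Mbps).
rates_mbps = {"802.11b": 11, "802.11g": 54, "802.11n (draft)": 540}

file_megabytes = 100                # hypothetical file size
file_megabits = file_megabytes * 8  # 1 byte = 8 bits

for standard, rate in rates_mbps.items():
    seconds = file_megabits / rate
    print(f"{standard}: about {seconds:.1f} seconds")
# 802.11b: about 72.7 seconds
# 802.11g: about 14.8 seconds
# 802.11n (draft): about 1.5 seconds
```

Note that network rates are quoted in bits, while file sizes are quoted in bytes, which is why the calculation multiplies by eight.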
Security can be a concern with these technologies, as eavesdroppers can monitor unprotected data traffic. However, basic security configuration is built into these networks, and users can enable password protection and traffic encryption by following the accompanying software instructions.
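As one example of how password protection works, WPA-style personal encryption derives a 256-bit key from the network password and the network name (SSID) using the PBKDF2 algorithm. The sketch below uses only Python's standard library; the passphrase and SSID are hypothetical values chosen for illustration.

```python
import hashlib

# Hypothetical network credentials (assumptions for illustration only).
passphrase = "correct horse battery staple"
ssid = "HomeNetwork"

# WPA/WPA2 personal mode derives a 256-bit pre-shared key (PSK) from the
# passphrase, using the SSID as the salt and 4096 rounds of PBKDF2-HMAC-SHA1.
psk = hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)
print(psk.hex())  # 64 hex characters; the same inputs always yield the same key
```

Because the SSID is part of the derivation, two networks with the same password but different names still end up with different encryption keys, which frustrates precomputed-key attacks.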