Wi-Fi and Ethernet are LANs, Local Area Networks (e.g. a room)
Many computers, 1 shared medium
Ethernet - shared wire
Wi-Fi - shared radio channel
1. Every computer has its own address (addr)
2. Every computer listens to the shared channel all the time
Packet "to:" field is the addr of the intended recipient
How to send: wait for silence, broadcast the packet
Everyone gets the packet - what "shared" means
How to receive: look for a packet to: your addr
Suppose the router has a packet that's supposed to go to computer2
The router broadcasts that packet .. it goes to everyone on the LAN
The non-intended recipients are supposed to ignore the packet
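The send/receive rules above can be sketched in a few lines of Python. This is a minimal simulation (the class and function names are my own, not from the lecture): broadcast delivers to everyone on the LAN, and each computer keeps only packets whose "to:" field matches its own addr.

```python
# Minimal sketch of a shared-medium LAN (illustrative names, not a real API)

class Computer:
    def __init__(self, addr):
        self.addr = addr
        self.all_heard = []   # everything on the shared channel
        self.received = []    # packets actually addressed to us

    def listen(self):
        # Receive rule: keep only packets whose "to:" field is our addr,
        # ignore the rest (that's what non-intended recipients do)
        for packet in self.all_heard:
            if packet["to"] == self.addr:
                self.received.append(packet)
        self.all_heard = []

def broadcast(lan, packet):
    # Send rule: the packet goes to every computer on the shared medium
    for computer in lan:
        computer.all_heard.append(packet)

lan = [Computer("computer1"), Computer("computer2")]
broadcast(lan, {"to": "computer2", "data": "hello"})
for c in lan:
    c.listen()
# computer1 ignores the packet; only computer2 keeps it
```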
1. Busy LAN
Suppose many many people are using the LAN
(alternately one person is making lots of traffic)
What does each computer observe?
The shared radio channel is busy with many packets (i.e. not silent)
Each computer gets fewer opportunities to use the channel
Basically each computer gets a smaller "slice" as there are more computers
The system slows but does not break
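The "smaller slice" idea can be shown with simple arithmetic. This is an idealized simplification of my own (real channel sharing is messier), assuming the channel divides evenly among N computers:

```python
# Idealized: N computers sharing one channel each get roughly 1/N
# of the total bandwidth (an assumption for illustration)

def fair_share_mbps(total_mbps, n_computers):
    """Each computer's rough slice of a shared channel."""
    return total_mbps / n_computers

fair_share_mbps(100, 1)    # channel all to yourself: 100.0
fair_share_mbps(100, 10)   # busy LAN: a 10.0 Mbps slice each
```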
2. Corrupted Packet
Computer receives a packet, but the checksum does not match
Maybe the receiver is at the outer-limit of radio range
What does this mean?
The packet was corrupted in transit
The receiver will request re-send of that packet
User observes: data comes through correctly, but slowly (if many re-sends)
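Here is a toy illustration of the checksum check. Real network checksums (e.g. CRC) are more robust; a simple sum-of-bytes stands in for them here, and the names are my own:

```python
# Toy checksum: sum of the data bytes, mod 256
# (real networks use stronger checksums like CRC; this is just illustrative)

def checksum(data):
    return sum(data) % 256

def verify(pkt):
    # Receiver recomputes the checksum; a mismatch means corruption
    return checksum(pkt["data"]) == pkt["checksum"]

packet = {"data": b"hello", "checksum": checksum(b"hello")}
corrupted = dict(packet, data=b"hellp")  # one byte flipped in transit

verify(packet)     # True  -> accept the packet
verify(corrupted)  # False -> request a re-send
```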
3. Bad Guy
Bad guy intercepts packets intended for others
Bad guy computer does not discard packets to: someone else
It's a shared medium, they really can see all the packets!
This is why we have encryption/https - a later lecture
"Bandwidth" - the speed of an internet connection, measured in bits per second
A typical LAN speed: 100 megabits per second (wired or wireless)
Slow home internet connection is maybe 1 megabit
Fast home internet is 20 megabits or more
Your local LAN speed is the fastest
Your "upstream" internet connection speed is typically slower
Recall 1500 byte packet, 1500 * 8 = 12000 bits
One strategy: convert to bits first
Q1: How many megabytes per second can a 100 mbps network send ignoring overhead?
Q2: How many seconds are required to send 1500 byte packet on 100 megabit network?
Q3: I have a 38 MB image. How long to send at 100 mbps?
Here we are computing the "ideal"/no-overhead speed
In reality, networking has a lot of overhead + possible sharing
Actual obtained speed might be 50-80% of ideal
Q1 1 megabyte is 8 million bits
How many times per second can we send 8 million bits?
100 million bits-per-sec / 8 million bits
100 / 8 = 12.5
12.5 MB per sec
In reality less due to overhead
Q2 1500 bytes is 12000 bits
12000 / 100 million is 0.00012 seconds
This packet ties up the wire very briefly!
Q3: We have 12.5 MB per second from Q1,
so 38 / 12.5 = about 3 seconds
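The three answers above follow the "convert to bits first" strategy and can be worked in a few lines of Python (ideal, no-overhead figures; variable names are my own):

```python
# Worked versions of Q1-Q3, ideal (no-overhead) speeds

MBIT = 1_000_000
network_bps = 100 * MBIT               # 100 megabit network

# Q1: 1 megabyte = 8 million bits, so MB per second = 100 / 8
mb_per_sec = network_bps / (8 * MBIT)  # 12.5 MB/sec

# Q2: 1500 bytes = 12000 bits
packet_secs = (1500 * 8) / network_bps # 0.00012 seconds

# Q3: 38 MB image at 12.5 MB/sec
image_secs = 38 / mb_per_sec           # about 3 seconds

# In reality expect maybe 50-80% of these numbers due to overhead
```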