Originally posted by Helen Thomas:
American ICAO codes all start with K; in everyday domestic use the K is generally dropped.
Most of these codes are IATA "luggage tag" codes. The ICAO "route-planning" code is also what GPS navigation uses; for the US, add a K to the start of the FAA code (which usually matches the IATA code) -- for example, KBED is Hanscom Field, Bedford, MA; KFFC is Falcon Field, Peachtree City, GA.
Thus Aberdeen, Scotland, has the International Civil Aviation Organization (ICAO) location indicator EGPD -- E for Northern Europe, G for the United Kingdom, P for the Scottish region, and D for Dyce field. Want to figure out LFPG? It's L for southern Europe, F for France, P for the Paris FIR, and G for Charles de Gaulle airport. Easy! One more example is EDDM -- E for northern Europe, D for Deutschland (Germany), and DM for München (Munich) airport. (EDMM, with the letters swapped, is the Munich FIR rather than the airport.)
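If it helps to see the nesting, here's a quick Python sketch of that decomposition. The lookup tables hold only the prefixes mentioned in this thread, not the full ICAO allocation, so treat it as an illustration rather than a real decoder:

```python
# Decode an ICAO location indicator into its nested prefixes.
# Tables cover only the examples from this thread, not all of ICAO Doc 7910.
REGION = {
    "E": "Northern Europe",
    "L": "Southern Europe",
    "K": "Contiguous United States",
}

COUNTRY = {
    "EG": "United Kingdom",
    "ED": "Germany (civil)",
    "LF": "France",
}

def describe(icao: str) -> str:
    """Break an ICAO location indicator into region, country, and local part."""
    icao = icao.upper()
    region = REGION.get(icao[0], "unknown region")
    if icao[0] == "K":
        # US codes use no country letter; the remaining three letters are the FAA code.
        return f"{icao}: {region}, FAA identifier {icao[1:]}"
    country = COUNTRY.get(icao[:2], "unknown country")
    return f"{icao}: {region}, {country}, local identifier {icao[2:]}"

print(describe("EGPD"))  # EGPD: Northern Europe, United Kingdom, local identifier PD
print(describe("LFPG"))  # LFPG: Southern Europe, France, local identifier PG
print(describe("KORD"))  # KORD: Contiguous United States, FAA identifier ORD
```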
So if London Heathrow has two codes -- and it does, LHR and EGLL -- how come I've heard Chicago O'Hare only called ORD? The answer is unique to the United States. In the 48 contiguous States the ICAO code is formed simply by adding a "K" to the FAA code. This explains why international flight plans refer to KORD, KMIA, KJFK, etc.
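That rule is mechanical enough to write down. Here's a minimal sketch (the function names are mine, and it deliberately covers only the 48 contiguous states, since Alaska and Hawaii are prefixed differently):

```python
def faa_to_icao(faa: str) -> str:
    """ICAO indicator for an airport in the 48 contiguous states: just prepend K."""
    return "K" + faa.upper()

def icao_to_faa(icao: str) -> str:
    """Recover the domestic FAA code by stripping the leading K."""
    icao = icao.upper()
    if len(icao) != 4 or not icao.startswith("K"):
        raise ValueError(f"{icao!r} is not a contiguous-US ICAO code")
    return icao[1:]

assert faa_to_icao("ORD") == "KORD"   # Chicago O'Hare on an international flight plan
assert icao_to_faa("KJFK") == "JFK"
```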
Originally posted by Jim Yingst:
Actually, the real reason is that it was a necessary step in the development of the universe, in order that the Morse code representation of Toronto Pearson International Airport could provide rhythmic inspiration to Geddy Lee and Neil Peart as they were traveling back to their hometown. This resulted in "YYZ", which is more than sufficient justification for Canada's naming policy. (Annoyingly, I couldn't find a link which included the Morse-code-inspired beginning to this piece, so you'll just have to take my word for it.)