
Canadian Airport Codes

 
Helen Thomas
Ranch Hand
Posts: 1759
Why do all Canadian Airport Codes start with a 'Y'?

Toronto is YYZ
Ottawa is YOW
Vancouver is YVR
Montréal-Dorval is YUL
Calgary is YYC
Edmonton YEG
Saskatoon YXE
London YXU
Victoria YYJ

I can't see that it has anything to do with the French language. I mean Canada in French is still Canada.

American codes all start with K; the K is ignored in general.

Thus, Sea-Tac (SEA) is actually KSEA; Paine Field (PAE) is KPAE, San Francisco (SFO) is actually KSFO, etc.

There are so many airports in America, however, that most of the smaller ones don't have descriptive abbreviations. Jefferson County, for instance, is 0S9, while Langley is W10.
[ August 16, 2004: Message edited by: Helen Thomas ]
 
Ranch Hand
Posts: 382
And the airport code for Sioux City, Iowa is SUX.
 
lowercase baba
Posts: 12766
According to this site, it's simply because at the time, Canada wanted uniformity in all their codes, and Y was not commonly used.
 
Ranch Hand
Posts: 305

Originally posted by Helen Thomas:
American codes all start with K; the K is ignored in general.



Helen... American airports don't all begin with 'K', unless there are different codes for the same airports. Or I misunderstood completely (always a possibility).

Here are a few that don't make much sense either.
Nashville, TN (BNA)
New Orleans, LA (MSY)
Newark, NJ (EWR)
 
Helen Thomas
Ranch Hand
Posts: 1759
ORY PARIS, FRANCE

YHZ HALIFAX, NS, CANADA


Most of these codes are IATA "luggage tag" codes. The ICAO "route-planning" code is used for GPS navigation too; for the US, add a K to the start of the IATA code -- for example, KBED is Hanscom Field, Bedford, MA; KFFC is Falcon Field, Peachtree City, GA.



There's a joke/true story of an overweight woman who rang the airline she flew with and asked whether they had started identifying luggage with descriptions of its owner. The airline had slapped a FAT sticker on her luggage - she had travelled to Fresno, CA, and FAT is Fresno's code.

Airport ABCs: An Explanation of Airport Identifier Codes

Thus Aberdeen, Scotland, has the International Civil Aviation Organization (ICAO) location indicator of EGPD -- E for Northern Europe, G for United Kingdom, P for Scottish region, and D for Dyce field. Want to figure out LFPG? It's L for southern Europe, F for France, P for Paris FIR, and G for Charles de Gaulle airport. Easy! One more example is EDMM. E for northern Europe, D for Deutschland (Germany), M for München (Munich) FIR, and M again for the Munich airport.

So if London Heathrow has two codes -- and it does, LHR and EGLL -- how come I've heard Chicago O'Hare only called ORD? The answer is unique to the United States. In the 48 contiguous States the ICAO code is formed simply by adding a "K" to the FAA code. This explains why international flight plans refer to KORD, KMIA, KJFK, etc.
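
In code terms, that contiguous-US rule is just a one-character prepend. Here's a minimal Java sketch of it (the codes are only the examples already mentioned in this thread; the class and method names are mine):

    // Minimal sketch of the contiguous-US rule: ICAO = "K" + FAA/IATA code.
    public class UsIcaoSketch {
        static String toIcao(String faaCode) {
            return "K" + faaCode; // only valid for the 48 contiguous states
        }

        public static void main(String[] args) {
            for (String code : new String[] { "ORD", "MIA", "JFK", "SEA", "SFO" }) {
                System.out.println(code + " -> " + toIcao(code));
            }
            // Prints ORD -> KORD, MIA -> KMIA, JFK -> KJFK, SEA -> KSEA, SFO -> KSFO.
            // Fields like 0S9 or W10 have no ICAO code at all, so the sketch
            // deliberately doesn't try to handle them.
        }
    }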



But still no explanation of the 'Y' prefix for Canadian Airports.
[ August 16, 2004: Message edited by: Helen Thomas ]
 
Jim Yingst
Wanderer
Posts: 18671
[HST]: But still no explanation of the 'Y' prefix for Canadian Airports.

Did you read Fred's post?
 
Helen Thomas
Ranch Hand
Posts: 1759
When it comes to Canadian airports, for instance, the codes begin with a Y.

It's not that we were far down on the list, it's just that at the time, Y wasn't much used in codes worldwide, so Canadian aviation authorities chose it as a basis for creating uniform and distinctive nationwide codes.


And that's it? Thanks, fred and Jim.

Here's another lot of reasons, but they mostly lack credibility. Take your pick. I'll settle for fred's.
 
Jim Yingst
Wanderer
Posts: 18671
Actually, the real reason is that it was a necessary step in the development of the universe, in order that the Morse code representation of Toronto Pearson International Airport could provide rhythmic inspiration to Geddy Lee and Neil Peart as they were traveling back to their hometown. This resulted in YYZ, which is more than sufficient justification for Canada's naming policy. (Annoyingly I couldn't find a link which included the Morse-code-inspired beginning to this piece, so you'll just have to take my word for it.)
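
For anyone who wants to tap out the rhythm themselves, a toy Java sketch that prints the Morse for YYZ (only the two letters needed; Y = "-.--" and Z = "--.." are the standard Morse assignments, everything else here is made up for illustration):

    // Toy sketch: print the Morse code for YYZ.
    import java.util.HashMap;
    import java.util.Map;

    public class YyzMorse {
        public static void main(String[] args) {
            Map<Character, String> morse = new HashMap<Character, String>();
            morse.put('Y', "-.--");
            morse.put('Z', "--..");

            StringBuilder beat = new StringBuilder();
            for (char c : "YYZ".toCharArray()) {
                beat.append(morse.get(c)).append(' ');
            }
            System.out.println(beat.toString().trim()); // -.-- -.-- --..
        }
    }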
 
Helen Thomas
Ranch Hand
Posts: 1759
That's hardly surprising. That's probably all Canada had going for it then.
I'm not sure whether the top award for full nerdy geekiness goes to the band or to you, Jim?

2112 VTY port
[ August 16, 2004: Message edited by: Helen Thomas ]
 
Leverager of our synergies
Posts: 10065
I was reading Chuck Palahniuk's "Fight Club", which has three translations into Russian, all available on the Internet if you know where to look. In Chapter 3, he goes through lots of airport codes. It took me a few years and a few blogs to figure this out, so I wondered how the translators would deal with it. All three transcribed the abbreviations, so LAX became something like "el ei ix". One translator explained in the comments that these are airport codes. Another carefully listed all the codes used, with full explanations of which city each belongs to, domestic or international. The third ignored the whole thing, so the reader was left to run his own investigation.
 
Jeroen Wenting
Ranch Hand
Posts: 5093
US ICAO codes all start with a K; US IATA codes are usually (but not always) the ICAO code minus the K.
Alaska codes start with PA (Pacific, Alaska); the P is dropped for IATA codes.
Hawaii codes start with PH, again dropping the P for IATA codes.

Canadian ICAO codes start with CY; like the US, they drop the first character to get the IATA code, which is why Canadian IATA codes start with a Y.
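
A quick Java sketch of that prefix-stripping rule, treating it as "usually but not always", as noted above. KORD and CYYZ come from this thread; PANC and PHNL are examples I'm adding (Anchorage and Honolulu, if memory serves):

    // Illustrative sketch of the ICAO -> IATA prefix rules described above.
    // Covers only contiguous US (K...), Alaska (PA...), Hawaii (PH...),
    // and Canada (CY...); real-world exceptions exist.
    public class IcaoToIataSketch {
        static String toIata(String icao) {
            if (icao.startsWith("PA") || icao.startsWith("PH")
                    || icao.startsWith("CY") || icao.startsWith("K")) {
                return icao.substring(1); // drop the leading character
            }
            return icao; // elsewhere there is no simple rule
        }

        public static void main(String[] args) {
            System.out.println(toIata("KORD")); // ORD
            System.out.println(toIata("CYYZ")); // YYZ (hence the Y prefix)
            System.out.println(toIata("PANC")); // ANC
            System.out.println(toIata("PHNL")); // HNL
        }
    }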


ICAO codes were originally created to allow long-range radio transmissions to easily identify their station.
At a time when, for example, London, Ontario and London, UK might both be received in, say, Reykjavik, a unique identification code was needed to avoid confusion.
In the 1920s and 1930s a system was devised in which each country was given a block of 4-letter codes.
Some countries got several entire letter blocks (like the US getting K and N, and I think more than that); others got smaller blocks (the Netherlands got EH and 2 more, for example) for their smaller needs.
These codes are still reflected in the ICAO airport codes today, and in many countries as part of aircraft registration codes (for example EI for Ireland, CC for Chile, N for the US, etc.).
The idea that it stems from the availability of a weather station is therefore not entirely incorrect (but incomplete).
The weather station indicates there's a radio transmitter at the field, which will have been given a radio identification code. Of course the reasoning is not complete, as other types of transmitters would also have led to a field receiving a code.

In some countries (most notably the US) there are also fields that have no ICAO or IATA codes. These are fields with no scheduled air traffic, principally small general aviation fields and military bases.
These fields are given codes by local aviation authorities (in the US either the FAA or state transportation departments).
[ August 17, 2004: Message edited by: Jeroen Wenting ]
 
ranger
Posts: 17344

Originally posted by Jim Yingst:
Actually, the real reason is that it was a necessary step in the development of the universe, in order that the Morse code representation of Toronto Pearson International Airport could provide rhythmic inspiration to Geddy Lee and Neil Peart as they were traveling back to their hometown. This resulted in YYZ, which is more than sufficient justification for Canada's naming policy. (Annoyingly I couldn't find a link which included the Morse-code-inspired beginning to this piece, so you'll just have to take my word for it.)



Darn, you stole my response.

Mark
 