The Left Coast is a humorous term, used chiefly by conservatives, for the West Coast of the United States, especially California. It first appeared in the early 1990s.