France

France is a country in Western Europe that is mentioned in The Walking Dead.

Post-Apocalypse
"It was the French. [...] They were the last ones to hold out as far as I know. While our people were bolting out the doors and committing suicide in the hallways, they stayed in the labs till the end."

- Dr. Edwin Jenner