When did the England women's national football team win their first major championship?
A. 1984
B. 2009
C. 2015
D. 2022
Answer
The England women's national football team won their first major championship in 2022, when they lifted the UEFA Women's Euro trophy with a 2-1 extra-time victory over Germany at Wembley Stadium. It was a historic moment for the team and for women's football in England.