What is the meaning of Colonialism?
The policy of a country seeking to extend or retain its authority over other peoples or territories, generally with the aim of economic dominance.
A colonial word, phrase, concept, or habit.
Colonial life.
Source: wiktionary.org