Center-right

From Simple English Wikipedia, the free encyclopedia

Center-right politics is a set of political opinions that usually, but not always, agrees with right-wing politics. Even when it does not, it is still normally more right-wing than left-wing politics. Examples of center-right groups include Rockefeller Republicans, Red Tories, and conservatives who are pro-life.