Women's college

Women's colleges are institutions of higher education whose students are all or almost all women. They are typically undergraduate, bachelor's degree-granting liberal arts colleges.