Drugs

A drug is a chemical that changes the way the body works. The word drug can mean:

  • Medicine - special chemicals or herbs that a doctor gives to people when they are sick, to help them get better
  • Illegal drugs - chemicals, pills, liquids, or parts of plants that people take to change the way they feel