Is there any evidence organic food is better for you?
It struck me that it's become one of those accepted truths that eating organic food is healthier, but has this belief actually been proven, or is it just a theory/feel-good thing?