The definition of "healthy" has never been stable. It has been constructed, deconstructed, and sold back to us in a new package roughly every ten years. Doctors endorsed cigarettes. The government told us to eat up to eleven servings of grains per day. An entire generation chased a body type that nutrition science now recognizes as a starvation response.
This is the full timeline — not a celebration of wellness culture, but an honest autopsy of it. Decade by decade, we will look at what "healthy" looked like, what the science actually said (or was paid to say), and who was writing the checks. The throughline is uncomfortable: we have been systematically misled about health for most of the modern era, and understanding that history is the first step toward finally getting it right.
"Every decade had its version of the truth. What changed was who was funding it."
The 1950s–60s: Clean Living, Smoking Doctors, and the Miracle of Processed Food
What "Healthy" Looked Like
Post-war prosperity had a very specific aesthetic. The healthy American was well-fed but not overweight — a distinction that mattered because thinness signaled poverty in an era of abundance. Men aimed for the broad-shouldered, lean-waisted look of Cary Grant or Gregory Peck. Women were expected to be curvaceous in the Marilyn Monroe mode — soft, womanly, nourished. Suburban life was the aspirational health ideal: fresh air, a house with a yard, a refrigerator full of modern food.
Physical activity was embedded in daily life more than celebrated as a separate practice. You mowed the lawn. You walked to the store. Formal exercise was for athletes and soldiers.
What the Science Said (and What We Were Told)
This is where it gets dark. In 1950, the New England Journal of Medicine published cigarette ads. The American Medical Association accepted tobacco advertising in its journals. The famous "More doctors smoke Camels" campaign ran from 1946 to 1952, featuring actual physicians endorsing a product we now know kills roughly one in two long-term users.
The processed food revolution was simultaneously being sold as a nutritional miracle. Wonder Bread was marketed as "building strong bodies 8 ways" (later upgraded to 12 ways). Crisco and margarine replaced animal fats with partially hydrogenated vegetable oils, the source of the trans fats that the FDA would finally ban in 2018 after they were definitively linked to heart disease.
In 1955, President Eisenhower's heart attack triggered a national conversation about cardiovascular health. Dr. Ancel Keys, a physiologist at the University of Minnesota, stepped into the void with a theory: dietary fat caused heart disease. His landmark Seven Countries Study would become one of the most cited — and most criticized — nutrition studies of the 20th century. The problem wasn't that Keys was entirely wrong; it was that he was selectively right. He reportedly cherry-picked countries that fit his hypothesis and left out data from nations (like France and West Germany) that didn't.
In the same period, the sugar industry was quietly funding researchers to shift blame away from sugar and toward fat. Internal industry documents, unearthed by researchers at UCSF in 2016, revealed that the Sugar Research Foundation paid scientists in 1967 to publish a review in the NEJM downplaying the link between sugar and heart disease. It worked. For decades.
What We Got Right
The germ theory of disease had matured, vaccines were transforming child mortality, and the basic framework of infectious disease was genuinely understood. Sanitation improvements were delivering real, measurable gains in lifespan. The era got hygiene right even as it got nutrition spectacularly wrong.
What We Got Wrong
Almost everything about chronic disease. Cigarettes were prescribed. Trans fats were "heart-healthy." Sugar was neutral. The entire chronic disease foundation of American medicine was being laid on a bed of industry-funded sand.
The 1970s: The Jogging Revolution, Atkins Act One, and the Original Diet Crime
What "Healthy" Looked Like
The 1970s had two competing aesthetics. On one side: the counterculture ideal — lean, natural, granola-and-sprouts, anti-processed. On the other: the nascent running movement — lean, tanned, aerobically capable. Jim Fixx published The Complete Book of Running in 1977, selling over a million copies and inaugurating the modern era of recreational exercise. A healthy person, by the end of the decade, was someone who ran.
(Fixx died of a heart attack during a run in 1984 at 52. He had genetic coronary artery disease. The irony was weaponized by everyone who had never liked exercise.)
What the Science Said
The fat-is-evil narrative solidified. George McGovern's Senate Select Committee on Nutrition issued its landmark Dietary Goals for the United States in 1977 — the first official government dietary guidelines. They recommended reducing fat and increasing carbohydrates. The food industry, particularly the meat and dairy lobbies, pushed back hard. McGovern revised the language. But the framework was set: carbs were safe, fat was dangerous.
Meanwhile, Dr. Robert Atkins published Dr. Atkins' Diet Revolution in 1972. It was controversial and ridiculed by mainstream medicine. The American Medical Association called it a "bizarre regimen." But millions of people found it worked — because it did restrict calories, just through a different mechanism. Atkins was ahead of the science on insulin and carbohydrate metabolism, but the establishment wasn't ready to hear it.
The sugar industry's research program continued in the background, quietly shaping which questions got asked and which answers got published.
What We Got Right
Aerobic exercise. The 1970s established beyond reasonable doubt that cardiovascular fitness matters. The research on VO2 max, aerobic capacity, and heart health conducted in this era holds up remarkably well. Moving your body, vigorously and regularly, extends life. The decade got this right.
What We Got Wrong
Demonizing dietary fat wholesale. The nuanced truth — that trans fats and refined carbohydrates were the real villains, while whole-food fats from olive oil, nuts, and fatty fish were beneficial — took another thirty years to fully surface.
The 1980s: Aerobics, Arnold, and the Fat-Free Obsession
What "Healthy" Looked Like
The 1980s gave us two simultaneous and contradictory body ideals. For women: Jane Fonda in legwarmers — high-aerobics, leotard-wearing, fat-burning intensity. For men: Arnold Schwarzenegger's bodybuilder physique — mass, muscle, and the Terminator jawline. Both extremes coexisted without anyone noting the contradiction.
The decade operationalized exercise. VHS workout tapes sold by the millions. Jane Fonda alone sold 17 million copies. Richard Simmons made sweating accessible to everyone who wasn't already a gym rat. "Feeling the burn" became a cultural shorthand for doing it right.
What the Science Said
The low-fat hypothesis was now fully entrenched in policy. In 1984, the National Institutes of Health officially endorsed the idea that reducing dietary fat would reduce heart disease. Food manufacturers responded with extraordinary speed: the fat-free product category was born. SnackWell's cookies, fat-free yogurt, fat-free salad dressing. The catch? When you remove fat from food, it tastes like cardboard. The industry's solution was sugar. Lots of it. Products labeled "fat-free" or "low-fat" routinely had more sugar than their full-fat counterparts, and nobody was measuring what that did to insulin resistance at population scale.
The bodybuilding world was simultaneously developing a proto-version of what would later become sports nutrition science — experimenting with protein timing, creatine (not yet mainstream), and the relationship between resistance training and body composition. Most of it was empirical bro-science, but some of it was surprisingly close to what the research would eventually confirm.
The phrase "no pain, no gain" — popularized by Jane Fonda — established a cultural belief that suffering was the metric of effective exercise. Overtraining syndrome wasn't in the popular vocabulary. The idea that rest was part of fitness was decades away from mainstream acceptance.
What We Got Right
Resistance training. The research from this era on strength training, muscle mass, and metabolic rate remains foundational. Muscle, it turns out, is one of the most important longevity organs — a fact that the 2020s longevity movement would rediscover with evangelical fervor. The 1980s bodybuilding culture was accidentally ahead of the curve here.
What We Got Wrong
The fat-free-equals-healthy equation was industrial nutrition policy's greatest error. It redirected decades of eating behavior toward refined carbohydrates and added sugars, contributing to what became an obesity and metabolic syndrome epidemic. The science of the era wasn't even consistently pointing in this direction; the narrative was largely industry-shaped, operating with a veneer of government endorsement.
"Fat-free products were the single most successful food marketing innovation of the century. They were also, almost entirely, a lie."
The 1990s: Heroin Chic, the USDA's Pyramid Scheme, and Fat-Free Everything Peaks
What "Healthy" Looked Like
The 1990s split into two aesthetics that coexisted uneasily and both caused enormous harm. In fashion: heroin chic. Kate Moss's 1993 Calvin Klein campaign made extreme thinness — sunken eyes, protruding clavicle, visible ribs — the defining image of feminine beauty. President Clinton publicly criticized the trend in 1997, but it dominated fashion imagery and trickled into mainstream cultural standards throughout the decade. A generation of young women received the message that visible bone was aspirational.
In fitness, the decade saw the rise of the step class, the elliptical machine, and the early era of running as lifestyle sport. "Healthy" women were now expected to be both thin and toned — a combination that requires extraordinary dietary restriction or genetic luck.
What the Science Said
In 1992, the USDA released the original Food Pyramid. It placed bread, rice, cereal, and pasta at the base — 6 to 11 servings per day. Fat was at the top, used "sparingly." The pyramid was presented as settled science. It was not. It was a compromise document shaped by industry lobbying, particularly from the meat, dairy, and grain industries. Walter Willett at Harvard, one of the world's leading nutritional epidemiologists, called it a "nutritional disaster."
Fat-free products peaked commercially. SnackWell's Devil's Food cookies contained zero fat and sold in quantities that suggested Americans believed they were health food. The sugar content was ignored because the official guidelines said fat was the enemy.
Olestra — a fat substitute that passed through the body unabsorbed — was approved by the FDA in 1996 for use in snack foods. It caused, memorably and often, "anal leakage." Products had to carry warning labels. The era's ability to create novel solutions to problems it had largely invented was impressive.
The diet industry reached what was then its peak: a $33 billion annual business. Jenny Craig, Weight Watchers, Slim-Fast, and Nutrisystem collectively processed millions of customers through programs with high short-term success rates and even higher long-term failure rates. The research on weight cycling — the health consequences of repeatedly losing and regaining weight — was beginning to emerge but had not yet reached public consciousness.
What We Got Right
Mediterranean diet research. The 1990s saw serious scientific interest in why populations in southern Europe had dramatically lower rates of heart disease. The research pointed toward olive oil, whole grains, legumes, fish, and moderate red wine — a pattern that would eventually become one of the most robustly validated dietary frameworks in nutrition science. It was largely ignored in favor of fat-free cookies, but the science was there.
What We Got Wrong
Everything about the fat-free decade. And the cultural elevation of extreme thinness as a health ideal, which inflicted lasting psychological damage on a generation, now in their 30s and 40s, still trying to undo years of disordered eating patterns.
The 2000s: Size Zero, Atkins Returns, and the Organic Awakening
What "Healthy" Looked Like
The early 2000s doubled down on thinness while adding new vocabulary. "Size zero" became a cultural marker. Paris Hilton and Nicole Richie defined the era's celebrity body ideal — angular, prominent hip bones, visible clavicle. Tabloid culture created a new genre: before-and-after weight loss photos as aspirational content, celebrity weight tracked as obsessively as stock prices.
By mid-decade, a counter-movement was stirring. Beyoncé and Jennifer Lopez were reintroducing curves as aspirational. The fitness industry was beginning to shift from pure cardio toward what would become functional fitness. But the dominant cultural message was still: thinner equals healthier.
What the Science Said
Atkins returned — this time with better science behind it. Low-carbohydrate diets were now being studied in randomized controlled trials. The data was stubborn: people on low-carb diets lost weight, improved their triglycerides, and increased their HDL. The conventional wisdom was forced to acknowledge what it had dismissed for thirty years.
High-fructose corn syrup entered the research spotlight. Studies began connecting its widespread introduction into the American food supply (roughly 1975-1985) with the obesity epidemic's trajectory. The sugar industry's decades of research suppression were starting to become visible as independent scientists worked backward through the epidemiological data.
The organic food movement moved from counterculture to mainstream. Whole Foods Market went public in 1992 but exploded in the 2000s. The argument was twofold: organic food was better for the environment, and conventional produce with pesticide residues might be damaging human health. The science on this was (and remains) genuinely mixed — but the movement correctly identified that the industrialization of food supply had introduced variables nobody had studied at population scale.
The supplement industry grew to $20 billion annually, driven partly by the Dietary Supplement Health and Education Act of 1994 (DSHEA), which deregulated dietary supplements and allowed them to be sold without FDA pre-approval, subject only to post-market safety surveillance. The result: thousands of products making health claims that ranged from plausible to outright fraudulent, with minimal regulatory oversight.
What We Got Right
The early evidence on metabolic syndrome — the cluster of high blood pressure, high blood sugar, excess abdominal fat, and abnormal cholesterol that dramatically increases risk of heart disease and type 2 diabetes — was beginning to cohere. Researchers were starting to understand that the obesity epidemic wasn't just an aesthetic problem; it was a metabolic catastrophe in slow motion. This framing would become foundational to the 2020s longevity revolution.
What We Got Wrong
The size-zero era's damage to body image, eating disorder rates, and the normalization of extreme caloric restriction was enormous and measurable. Eating disorder hospitalizations rose sharply through the decade. The definition of "healthy" was still primarily aesthetic and external rather than functional and biomarker-based.
The 2010s: Strong Is the New Skinny, and the Data Revolution
What "Healthy" Looked Like
The 2010s were the decade of the body transformation. "Strong is the new skinny" entered the lexicon, driven by CrossFit's explosive growth, the rise of Instagram fitness culture, and Michelle Obama's highly visible commitment to strength training. The cultural ideal shifted — slightly — from emaciated to athletic. Abs replaced hip bones as the aspirational body metric.
But the shift was more superficial than it appeared. Instagram fitness culture created a new set of unattainable body standards, this time requiring both extreme leanness and visible muscle definition — a combination that in many cases required pharmaceutical assistance or extraordinary genetic advantage, rarely acknowledged in the content.
Wearable technology arrived. The Fitbit (2009, but mainstream by 2011-12) made step counts a cultural metric. The Apple Watch (2015) embedded health monitoring into luxury fashion. For the first time, "healthy" had data attached to it — though the data was mostly measuring outputs (steps, heart rate) rather than meaningful biomarkers.
What the Science Said
The microbiome revolution. A cascade of research from the Human Microbiome Project (2012) and subsequent studies established that the trillions of bacteria inhabiting the gut were not passengers but active participants in metabolism, immunity, and even mental health. The gut-brain axis — the bidirectional communication between the digestive system and the central nervous system — became a serious research area, with implications for depression, anxiety, and cognitive function. WellSourced has covered this in depth: The Gut-Brain Connection: What Your Microbiome Is Actually Doing to Your Mood.
Intermittent fasting moved from fringe to mainstream. Studies on time-restricted eating, alternate-day fasting, and the 5:2 protocol accumulated enough evidence to make IF a legitimate dietary intervention rather than a health food trend. The mechanisms — autophagy, metabolic switching, insulin sensitivity — gave it a scientific framework that previous diet trends had lacked.
The ketogenic diet experienced a research renaissance. Originally developed in the 1920s for pediatric epilepsy, keto was now being studied for metabolic syndrome, type 2 diabetes reversal, and neurological conditions. The evidence was genuinely promising in some areas and overstated in others, but the underlying science of ketone metabolism was solid.
Epigenetics entered mainstream wellness vocabulary. The discovery that lifestyle choices — diet, exercise, stress, sleep — could literally alter gene expression without changing DNA sequence gave the wellness industry a new scientific language. It was also, at the consumer level, frequently oversimplified and occasionally weaponized to sell supplements with dubious evidence bases.
Peptide science was quietly advancing in research settings. BPC-157, TB-500, and other peptides were being studied in animal models with remarkable results in tissue repair, inflammation reduction, and metabolic function. The research was mostly preclinical — no large-scale human trials — but the mechanistic data was compelling enough to attract serious attention. This is a story WellSourced has covered extensively: Peptides 101: A Beginner's Guide to What They Are and How They Work →
What We Got Right
Sleep. The 2010s produced the research that turned sleep from a soft wellness recommendation into a hard medical priority. Matthew Walker's work at UC Berkeley, popularized in Why We Sleep (2017), synthesized decades of sleep research into a compelling case: chronic sleep deprivation was a risk factor for virtually every chronic disease. Seven to nine hours wasn't laziness — it was physiology.
Strength training's longevity benefits were now undeniably established. Muscle mass emerged as one of the strongest predictors of all-cause mortality. The research on sarcopenia — age-related muscle loss — showed that the 1980s bodybuilders had been accidentally right about the most important thing: building and maintaining muscle was a longevity intervention.
What We Got Wrong
Instagram made health performance into a social currency. The comparison culture it enabled drove anxiety, disordered eating, and body dysmorphia at scale. The decade ended with growing evidence that social media use was correlated with declining mental health, particularly in adolescent girls — a finding with obvious irony given the era's "strong is the new skinny" branding.
"The 2010s gave us the most health data any generation had ever seen. We used most of it to take better photos at the gym."
The 2020s: Longevity, GLP-1s, Peptides, and Personalized Everything
What "Healthy" Looks Like Now
The aesthetic ideal of the 2020s is, for the first time in the modern era, genuinely complicated. "Healthy" is now partially decoupled from appearance. The longevity movement — driven by figures like Peter Attia, David Sinclair, and Andrew Huberman — defines health in terms of biomarkers, biological age, VO2 max, DEXA scans, and continuous glucose monitoring. Bryan Johnson's Blueprint project, whatever one thinks of its extreme implementation, represents something genuinely new: the quantification of aging as a health target.
Cold plunge culture proliferated. Red light therapy went from medical device to consumer product. Zone 2 training — sustained, moderate-intensity aerobic work — emerged as the longevity fitness modality of choice after decades of HIIT dominance. The message shifted from "burn maximum calories" to "optimize metabolic flexibility."
GLP-1 receptor agonists — semaglutide (Ozempic/Wegovy), tirzepatide (Mounjaro/Zepbound) — transformed both medicine and culture. For the first time, the biology of appetite and satiety was not a matter of willpower but of pharmacology. The cultural implications are still being processed: if obesity is a disease of hormonal dysregulation rather than moral failure, what does that mean for sixty years of diet culture? WellSourced covered the full arc in The GLP-1 Culture War: What Ozempic is Really Changing →
What the Science Says
Healthspan over lifespan. The field has largely moved away from the goal of simply living longer toward the goal of living better for longer — maintaining cognitive function, physical capacity, and metabolic health into the eighth and ninth decades. This reframe changes everything about how we measure success.
Peptide research has accelerated. The mechanistic data on certain peptides — their ability to modulate specific receptor pathways, support tissue repair, and influence metabolic function — is now robust enough that serious longevity clinics and researchers are paying close attention. The landscape is complex: some peptides have extensive human safety data, others are still largely preclinical. Understanding the difference matters. Start with our foundational guide if you're new to the space →
Continuous glucose monitoring went consumer. Devices like Levels and Abbott's Libre allow real-time tracking of blood glucose responses to food, exercise, and stress. The data is revealing extraordinary individual variation — the same meal can spike blood sugar dramatically in one person and barely register in another. Personalized nutrition, once a theoretical ideal, is now technically feasible.
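The per-person variation this data reveals is easy to see even in a toy analysis. The sketch below is illustrative only: the readings are invented, and the 70–140 mg/dL "in range" window and metric names are common conventions, not any vendor's API.

```python
from statistics import mean, stdev

def glucose_summary(readings_mg_dl, low=70, high=140):
    """Summarize a series of CGM readings (mg/dL, evenly spaced).

    Returns mean, variability (standard deviation), and percent
    time-in-range: three metrics consumer CGM apps commonly surface.
    """
    in_range = sum(low <= g <= high for g in readings_mg_dl)
    return {
        "mean": round(mean(readings_mg_dl), 1),
        "std_dev": round(stdev(readings_mg_dl), 1),
        "pct_time_in_range": round(100 * in_range / len(readings_mg_dl), 1),
    }

# Two hypothetical people eating the same meal: one spikes, one barely moves.
person_a = [92, 95, 150, 178, 160, 128, 101, 94]
person_b = [90, 93, 104, 110, 106, 98, 94, 91]

print(glucose_summary(person_a))  # large spike, only 62.5% time-in-range
print(glucose_summary(person_b))  # flat response, 100% time-in-range
```

On these invented curves, the same meal leaves one person in range for barely five of eight readings while the other never leaves it; that spread, multiplied across thousands of meals and users, is what makes personalized nutrition more than a slogan.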
The longevity biology behind interventions like metformin, rapamycin, NAD+ precursors, and senolytics is being actively studied in human trials. The science is ahead of the clinical guidelines and well behind the consumer market — a gap that requires careful navigation. We covered the research landscape in our Beginner's Guide to Longevity Science →
What We're Getting Right
The framing has fundamentally shifted. "Healthy" is increasingly defined by function, not form. Biomarkers matter more than BMI. Metabolic health — insulin sensitivity, inflammatory markers, lipid particle size, blood glucose variability — is becoming the language of preventive medicine, not just specialty clinics.
The honest acknowledgment of failure is also new. The research community has spent the last decade formally retracting or revising the bad science of the fat-free era, the sugar industry funding, and the USDA pyramid. Nutrition epidemiology is grappling seriously with its methodological limitations. The era of confident dietary pronouncements based on industry-funded cohort studies is (mostly) over.
What We're Getting Wrong
The longevity industry is already generating its own mythology. Cold plunges are good for recovery; they are not the cure for all inflammation. Continuous glucose monitoring provides useful data; obsessing over every postprandial spike is a pathway to anxiety and orthorexia, not health. Biological age tests using methylation clocks have significant measurement error. Bryan Johnson's supplement stack costs roughly $2,000 per month and has not demonstrated a single outcome in a randomized controlled trial.
The biohacking ecosystem has also, predictably, attracted the same bad actors who have populated every previous wellness wave — supplement companies selling products with implausible claims, influencers monetizing the performance of optimal health, and a market structure that rewards novelty and narrative over evidence.
The personalization revolution is also exacerbating health inequality. Comprehensive blood panels, continuous glucose monitors, and longevity clinics are expensive. The science of healthspan may be democratizing; the access to its interventions is not.
The Pattern, Decoded
Reading seventy years of health history consecutively makes the pattern visible in a way that living through individual decades obscures. Every era had three consistent features:
- An industry with a financial interest in the dominant health narrative. Tobacco. Sugar. Low-fat packaged food. Diet programs. Supplement companies. Each built a scientific infrastructure that supported its commercial interest.
- A cultural aesthetic that was presented as a health target. The conflation of beauty standards with health standards is not accidental. Thinness was marketed as both attractive and healthy. The two are not the same thing, but the confusion was profitable.
- Legitimate science that got partially or entirely buried. The Mediterranean diet evidence existed in the 1990s. The carbohydrate-insulin model had credible research behind it in the 1970s. Atkins was dismissed and was basically correct. The sugar-heart disease research was funded and suppressed. In every era, financial interests shaped which truths got amplified and which got delayed.
The question for the 2020s is whether the longevity and personalized medicine movement can break this pattern, or whether it is simply the latest version of the same structure with a different set of beneficiaries. The answer is probably: both. The science is better than it has ever been. The market will still try to exploit it.
What This Means for You
The practical implication of this history is not cynicism — it is calibration. It means:
- Be appropriately skeptical of consensus. Official guidelines have been wrong before, often because they were shaped by interests other than public health. This doesn't mean rejecting all guidance; it means applying the same critical thinking to health claims that you apply to any other domain where money is involved.
- Prioritize the interventions with the longest and most consistent evidence base. Exercise, both cardiovascular training and resistance training. Sleep. Vegetables and whole foods. Stress management. These have survived every decade's revision. The newer interventions (peptides, metabolic monitoring, advanced supplementation) may prove equally robust; they haven't had seventy years to prove it yet.
- Separate appearance from health. The most consistent damage of the past seven decades has come from conflating these. Health is functional, metabolic, and biochemical. It often correlates with appearance but it is not the same thing.
- Demand evidence quality, not just volume. A hundred studies funded by the same industry are worth less than five independent, well-designed randomized controlled trials. Learning to read research quality — not just quantity — is the core skill of the modern health consumer.
"The most dangerous health information is usually true enough to be plausible and funded enough to be everywhere."
WellSourced exists because this history is real, and because the noise-to-signal ratio in health information has never been higher. We cover the science with the same skepticism we'd apply to any industry where billions of dollars are at stake — because in health, they always are.
Start with the fundamentals: Peptides 101 → | Longevity Science: Where to Start → | Wellness Trends Through the Decades →
Frequently Asked Questions
When did the "fat-free" food craze start and why?
The fat-free craze emerged in the mid-1980s and peaked in the early 1990s, following the 1984 NIH endorsement of the low-fat dietary hypothesis and the 1992 USDA Food Pyramid, which placed carbohydrates at the base and recommended using fats "sparingly." The hypothesis — that dietary fat caused heart disease — had been shaped partly by industry-funded research from the 1960s that shifted blame away from sugar. Food manufacturers responded by creating an entire category of fat-free and low-fat products, which typically compensated for the absence of fat with added sugar, contributing to the metabolic health crisis that followed.
What is the history of diet culture in America?
Diet culture in America has roots in the late 19th century but accelerated dramatically after World War II. The 1960s saw the founding of Weight Watchers (1963) and the beginning of the modern diet industry. The 1970s brought the first government dietary guidelines, which prioritized carbohydrates and the reduction of fat. The 1980s saw the aerobics boom and the fat-free product wave. The 1990s represented peak diet culture — a $33 billion industry, heroin chic aesthetics, and the USDA pyramid encoding low-fat ideology into policy. Each decade built on the previous one's errors while generating new ones.
How have health standards changed from the 1980s to now?
The 1980s defined health primarily through appearance and cardiovascular endurance — the aerobics era. By the 2010s, the definition had expanded to include strength training, sleep quality, stress management, and gut health. By the 2020s, health is increasingly defined in terms of biomarkers: blood glucose variability, inflammatory markers, biological age, VO2 max, and metabolic flexibility. The shift is from appearance-based to function-based definitions — though the appearance ideal persists in culture even as the science has moved on.
What is "healthspan" and how is it different from lifespan?
Lifespan is simply how long you live. Healthspan is how long you live in good health — with cognitive function, physical capability, and metabolic health intact. Longevity medicine has shifted its focus to healthspan because simply extending lifespan without extending healthspan just means more years of frailty, cognitive decline, and chronic disease. The goal is to compress morbidity — to remain functional and vital until very late in life, then decline quickly rather than slowly.
Are GLP-1 drugs like Ozempic actually safe long-term?
GLP-1 receptor agonists have a longer safety record than most people realize: the first GLP-1 drug (exenatide) was approved in 2005, and semaglutide has been used for type 2 diabetes since 2017. The cardiovascular safety data is actually impressive: the SUSTAIN-6 and SELECT trials showed reduced major cardiovascular events in high-risk patients. The primary concerns — potential thyroid C-cell tumors (based on rodent data not replicated in humans), pancreatitis risk (small and contested), and muscle mass loss with rapid weight reduction — are real considerations that should be discussed with a physician. They are not approved for everyone, and they work best as part of a comprehensive metabolic health program.
What does the history of health trends tell us about peptides?
The history suggests appropriate caution combined with genuine interest. Peptides represent a mechanistically plausible class of compounds with a growing research base — but they are in an earlier stage of human evidence than interventions like exercise, Mediterranean diet, and sleep, which have decades of robust human trial data. The pattern of health history warns against both blanket dismissal (many dismissed interventions turned out to be valid) and uncritical adoption (many celebrated interventions turned out to be harmful or useless). Evaluating peptides rigorously, on their actual evidence, is the correct approach. Our Peptides 101 guide → is the best place to start.