Paradigm of Modern Medicine
SENTENCES:
Treating the symptoms of a disease rather than the disease itself hides the effects of unhealthy lifestyle choices. It enables, rather than heals, and has led to an overall sicker population.
PARAGRAPHS:
The way medicine is practiced today has led to a sicker population. While there has been success in treating acute issues, the framework through which we view chronic disease is misguided and harmful.
The use of medication as a crutch masks the symptoms of chronic disease, giving the appearance of healing, while the disease process continues unabated. We are enabling rather than healing.
Healing is a return to self-sustained equilibrium. Healing is a process, not a medication. The longer we can maintain a self-sustained equilibrium, the better chance we have of living a long and healthy life. Chronic disease develops over years to decades, and symptoms are often not apparent until late in the disease process. In frustrating but elegant symmetry, healing occurs on a similar time frame.
Because of this symmetry, the earlier a disease is found, the easier it can be treated. Preventing disease by focusing on the pillars of health, specifically diet, exercise, sleep, and mood (mental health) is even better. Unfortunately, all the incentives imposed on physicians limit their ability to appropriately treat chronic disease.
There is a role for medication in medicine. Congenital disease can result in the complete failure of a vital function, as in type 1 diabetes, and medication is often necessary in such cases. The body has a remarkable ability to heal itself, but sometimes irreversible damage occurs, whether from untreated chronic disease or a severe traumatic accident, and medication may be necessary then as well. And even when a disease process is reversible, medication may be indicated to help mitigate further damage while healing takes place. Americans, though, are prescribed far more medications than people in other countries, without measurable benefit in key indicators such as life expectancy, and this is because of the paradigm forced upon us by corporate and political influence.
The symptoms of disease are not the disease, and focusing on treating the symptoms allows the disease process to progress while masking its outward manifestations. The paradigm of modern medicine fails because treatment starts too late and focuses on the superficial manifestations of a disease rather than the underlying cause. Rates of chronic disease and their personal and societal costs will continue to increase unless we make significant changes in our approach to treatment and healing.
The way medicine is practiced today has led to a sicker population. The use of medication as a crutch masks the symptoms of chronic disease, giving the appearance of healing, while the disease process continues unabated. The current paradigm of modern medicine enables unhealthy behavior rather than heals.
Obesity rates are the most obvious manifestation of declining population health. In the early 1970s, 15% of the population was obese. Today it’s almost 50%, with 75% of adults overweight. Obesity is just one sign of metabolic dysfunction – it just happens to be the most readily apparent. Rates of type 2 diabetes have quintupled over the same period, from 2% to over 10%. Healthcare expenditures in the US have gone from $74 billion in 1970 to $1.4 trillion in 2000 to over $4.3 trillion in 2021, and one out of every four of those dollars is spent on diabetes care. In children, rates of non-alcoholic fatty liver disease (NAFLD) are estimated at 5–10%. The condition was almost non-existent only a few decades ago, and it is permanent liver damage that will stay with them their entire lives. In 2023, 29% of Americans reported that they had been diagnosed with depression. Diagnostic criteria have changed over the decades, but in 1970 the rate was around 5%. The largest increases have been in women and teens.
If so much more money is being spent on healthcare, why are disease rates so high? Is capital-M Medicine (the institution) really contributing to worsening health, or is it doing its best in an unwinnable battle against unstoppable social and lifestyle factors?
Obesity, type 2 diabetes, and NAFLD are all related to excess calorie intake. Most of the acquired chronic diseases with which we struggle are. Metabolic dysfunction may also contribute to depression and many other mental illnesses. The availability (it’s everywhere) and accessibility (it’s cheap) of high-calorie foods are unprecedented. A longer discussion on this specific issue is warranted, but for now it should not be controversial to simply state this as a fact. Modern life also comes with myriad constant stressors aided and exacerbated by the overwhelming unavoidability of the news and social media. This is devastating to mental health.
Strategies to fight these new and ever-changing foes are necessary. The question, though, is whether Medicine is fighting for us or deceptively siding with the opposition.
The concept of a doctor and their role has varied throughout time and across cultures. From shamans in almost all indigenous cultures to curanderos (Latin America), sangomas (Africa), and hakims (Middle East and South Asia), healers have always played an integral part in their communities. Attempts to understand the nature of illness have ranged from humors in ancient Greece (ambiguous bodily fluids whose abnormal flow results in sickness) and qi in traditional Chinese medicine (even more ambiguous energy channels dependent on unimpeded flow) to simply ascribing ailments to evil spirits. Abundant useful knowledge has emerged from the trial-and-error process of using available plants and natural materials in search of various cures. But since chronic illness was rare in these communities, most healing efforts were focused on acute disease, and because many acute illnesses are self-limiting, less useful medical knowledge also persisted due to overvalued anecdotal evidence.
“Doctors” in the West had much more inauspicious beginnings than the traditional village healers. Rather than being integral parts of the community, many were snake oil salesmen promising miraculous cures, only to be constantly on the move as their fraud became apparent.
The 1800s brought about germ theory and the establishment of medical schools. The idea that infections are caused by unseen organisms and can be limited by proper hygiene or appropriate antibiotics has been the single greatest leap forward in medicine and the biggest extender of human life. While positive advances were made towards actually helping people, medicine was still far from a respected profession. It was dirty, gruesome, and often did more harm than good. Lack of trust in the medical establishment persisted for decades, but over time physicians became the most trusted profession in the United States. Like the indigenous village healer, the quintessential nostalgic doctor who hung up their shingle and started their own practice was an integral part of the local community.
In 1966, 75% of Americans had “great confidence in medical leaders.” Today that number is 34%. While that number surely took a large drop over the past few years with inconsistent and ineffective recommendations around the COVID pandemic, trust has been gradually eroding over decades as physicians have slowly but surely withdrawn from the community to the corporate world.
One of the first important but underappreciated steps in this process came from a landmark Supreme Court ruling, Goldfarb v. Virginia State Bar, in 1975. This ruling specifically targeted the practice of law, but was quickly applied to other professional organizations. The ruling reduced law, medicine, and other learned professions to “ordinary purveyors of commerce” and revoked much of their ability to self-regulate as independent professional societies. Simply put, the Federal Trade Commission would regulate medicine rather than an internal code of ethics.
Within five years the FTC sued the AMA over aspects of its ethical standards on the grounds of preventing free market competition. The suit alleged that two sections of the AMA’s Principles of Medical Ethics violated Sherman Antitrust laws, specifically the section that prohibited advertising, explicitly stating that the ban seriously inhibited the growth of HMOs (the horror!). The suit also attacked restrictions on physicians’ contractual opportunities with corporations. The AMA had opposed such contracts on the ethical grounds that fees for physician services should accrue to the physician and not a corporate entity, and that a physician employed by a corporation may find themselves under undue pressure to prioritize corporate goals over patient well-being.
The FTC won the lawsuit, breaking down the ethical walls surrounding medicine and allowing a flood of corporatism into an industry that had previously prided itself on maintaining the utmost respectability. Further regulatory burden and increased administrative costs have continued to pull doctors from the community into the corporate world, resulting in a climate where physicians are ultimately accountable to their company rather than their patients.
Around this same time, as healthcare was being invaded by corporate oligarchs, the US came off the gold standard backing its currency. A full examination of the effects of this move is both impossible and unnecessary here, but US dollars were no longer backed by gold and could thus be more easily created. As with any commodity, an increase in supply leads to a decrease in value. The supply of US dollars was then constrained only by an arbitrary debt ceiling set by Congress, which always eventually gets raised, giving only the illusion of financial prudence. While dollars have maintained their strength relative to other countries’ currencies because of American economic prowess and strategic alliances, their purchasing power has plummeted compared to real goods. This is why a hamburger that used to cost 25 cents is now at least $5.00. Even with advances in technology that should result in cheaper costs of production, prices on essentially all goods continue to rise.
The most illustrative example of the dangers of debased currency is the propagation of endless wars. When going to war, countries or rulers would need to rely on the funds acquired from their respective citizens. The war could only last as long as money could be raised. Medieval wars mostly consisted of short raids to restock the coffers to fund the next raids. Even in World War I, American war bonds were issued and US citizens chose to invest to support what they saw as a worthy cause. But war is no longer dependent on the support of the respective populations. Rather than raise money from the populace, governments can simply issue debt to pay for needed expenses. This is money essentially created out of thin air and the reason why unfathomably expensive wars can continue indefinitely. For the first time in history, there is no constraint on war funding. Of course, this debt eventually comes due, so in order to pay it off, the government simply creates more money to pay it, devaluing the currency further.
War is extremely profitable for the defense contractors involved in supplying its necessities. If the price of human life is ignored, there is significant incentive to find endless battles to drive perpetual income streams. While not as absolutely horrific as war, in the same way the military industrial complex profits from human suffering, the medical industrial complex unquestionably profits from illness. Both hospital corporations and insurance companies benefit from increasing medical costs, and sick people generate much more revenue than healthy ones. Public companies (which include most hospital systems and insurance providers) exist with the primary injunction to increase shareholder value. There is absolutely no question that corporate incentives are not aligned with the average person’s desire to live a long and healthy life. There is also no question that these same groups have infiltrated public health organizations and exert undue and insidious influence on policy. It is no coincidence that chronic diseases, illnesses requiring lifelong treatment and thus consistent revenue streams, also began skyrocketing in the 1970s. Healthcare costs as a percentage of GDP in the US have gone from 7.3% in 1970 to 19.7% in 2020. Endless disease, like endless war, is extremely profitable to small groups of individuals, while devastatingly detrimental to the population at large.
So where does this leave your local doctor? Are they fighting valiantly against the endless deluge of modern life that unavoidably leads to chronic disease? Or are they silently or unknowingly complicit in perpetuating a corrupt system that benefits from poor public health? Do they truly have your best interests at heart, or are they conflicted in their responsibilities to their corporate employer? Are they integrated into the community or an employee ID number in a faceless corporation? Are they doing their best within imposed constraints or burned out from misguided purpose? Are they a healer or an enabler?
Physicians operate under a number of constraints, and we will focus here on primary care physicians, who are the first point of contact with the healthcare system.
First, most employment arrangements are fee-for-service, meaning the doctor gets paid for how much work they do. Seems reasonable, right? But it creates the incentive to see as many people as possible in a day, which results in shorter appointment times, less discussion, hastier diagnoses, and prescriptions for the easiest effective solution.
Second, physicians are paid for what they can bill. The patient may have a copay associated with their insurance that they pay prior to a medical visit, but the bulk of physician compensation comes from billing the insurance company after the visit. This is an arduous and complicated process and a primary driver of the skyrocketing administrative costs of running a medical practice. As a general rule, you work for whoever pays you. You are not paying your physician -- your insurance company is. This third-party payor also contributes to price obfuscation when it comes to medical services, but that is a topic for another day.
At the end of a visit (or during, if they’re being especially efficient), the physician inputs the various diagnoses and the treatment plan into the medical record, which is then sent to the patient’s insurance company. Each diagnosis is associated with a code, and there must be a code for a service to be billable. The current iteration of billable codes is ICD-10, adopted for billing by the Centers for Medicare & Medicaid Services (CMS). These codes are essentially all focused on treatment of existing disease rather than prevention. Insurance companies generally pay for one “preventative” visit a year, but the rest of billing is based on disease treatment. Again, this makes some sense, but it also disincentivizes monitoring and attention to the early signs of burgeoning disease, because addressing a problem before it is billable is uncompensated. In paying only for the treatment of disease rather than its prevention, insurance companies impose their priorities onto the doctor-patient relationship. No matter how good a person your PCP is, they’re probably not going to expend too much energy for free.
Third, physicians are influenced by legal liability. Like the other two constraints, this should generally be a good thing: you don’t want a careless doctor prescribing dangerous treatments or ignoring warning signs. But it also incentivizes “defensive” medicine, in which excessive tests and imaging are ordered, at quickly mounting expense, so as not to miss a potential diagnosis for which the physician could later be sued. Even with a minuscule chance of an obscure pathology, from the physician’s point of view it’s advantageous to order the extra tests to minimize the chance of future legal issues. Hospitals and the companies providing these services obviously love this for their bottom lines. Insurance companies generally benefit as well, because they can push increased costs to their broad consumer base without causing too much of a fuss. They will occasionally push back on testing and require a prior authorization, but as mentioned earlier, both healthcare institutions and insurance companies benefit from rising costs. (In bizarre reasoning, part of the reason insurance companies are okay with high prices is that, by law, their profits and overhead are limited to a percentage of incoming insurance premiums. Increasing medical costs can be passed on to consumers, which increases overall incoming premiums, allowing executives to pay themselves more.)
Legal liability also incentivizes doctors to prescribe medications more quickly. For example, if lab work reveals that a patient has high cholesterol and meets the recommendations for starting a statin, the doctor prescribes the medication and probably rechecks cholesterol levels in 3–6 months. In a situation like this, a risk calculator may show that the patient has, say, a 10% risk of a severe cardiovascular event (i.e., heart attack or stroke) over the next ten years. The risk calculator also says that under ideal conditions, this risk could be decreased to 6% over those same 10 years. In their current condition, there’s about a 1% chance per year of a serious event. (It’s actually slightly lower than 1% in the first year and slightly higher than 1% in the tenth year, because increasing age brings increasing risk.) Ideal conditions would result in a 0.6% annual risk, or a 0.4% per year decrease in risk. This is a relatively small risk benefit, and once a patient is started on a medication like a statin, they are likely to be on it the rest of their lives. Statins do come with a range of side effects, and while they are generally less severe than a debilitating heart attack or stroke, it’s important to remember that no intervention is benign, or without risk. With so little benefit, a reasonable alternative strategy would be to talk with the patient about possible lifestyle interventions and recheck cholesterol after they are implemented, to see if risk can be reduced that way, without introducing the added risk of a medication. The problem, though, is that if an unlucky patient hits that 1%-per-year risk and dies of a heart attack, the grieving family members could easily sue the “negligent” doctor who ignored medical society guidelines to start a statin medication. (The topic of misguided or absent risk/reward conversations will be covered in a future essay.)
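The arithmetic above can be made explicit with a quick sketch. The 10% and 6% figures are the essay's illustrative numbers, not outputs of any particular risk calculator, and the per-year figures are simple averages (real risk rises with age, as noted above):

```python
# Illustrative arithmetic for the statin example.
# The 10% and 6% ten-year risks are hypothetical inputs, not
# values from a real cardiovascular risk calculator.

ten_year_risk_current = 0.10  # 10-year risk of a severe event, current condition
ten_year_risk_ideal = 0.06    # 10-year risk under ideal conditions

# Rough per-year risks (simple average over the decade)
annual_current = ten_year_risk_current / 10  # ~1% per year
annual_ideal = ten_year_risk_ideal / 10      # ~0.6% per year

# Absolute risk reduction: the ~0.4%-per-year figure from the text
arr_per_year = annual_current - annual_ideal

# Relative risk reduction over the decade -- the number that
# tends to appear in headlines because it sounds larger
rrr = (ten_year_risk_current - ten_year_risk_ideal) / ten_year_risk_current

print(f"Absolute risk reduction: {arr_per_year:.1%} per year")
print(f"Relative risk reduction: {rrr:.0%} over ten years")
```

The same change can be framed as a 0.4%-per-year absolute reduction or a 40% relative reduction; the latter framing is part of how intervention benefits get overstated, a point the essay returns to below.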
To summarize, physicians are incentivized to see as many people as possible, to treat only established disease while largely ignoring preventative measures, to treat those diseases with medication, and to disregard costs. There is no incentive to keep a patient healthy. Being generally good people, most physicians try to reasonably balance these forced incentives with their true desire to help patients, but they have only so much room to work within these set parameters.
This removal of physicians from the community to the corporate world, misaligned incentives, and imposed constraints all bring into question whose side the physician is truly on. It’s obvious that the goals of big healthcare and the general population do not align, and doctors are caught somewhere in the middle. But intentional or not, physicians have fallen into the trap of being enablers rather than healers.
Healing is a return to self-sustained equilibrium. Healing is a process, not a medication.
With an acute illness, such as a bacterial infection, medication can be used to help the body fight off the invaders and/or control symptoms while it does so on its own. These illnesses generally do not last very long, and the body quickly returns to its normal state of health. Healing is achieved.
Chronic illness (type 2 diabetes, fatty liver, COPD, depression, etc.) is a completely different animal. Where symptoms of a bacterial illness generally manifest within hours or days of the infection taking hold, chronic illness can build for years and decades before the first symptoms become apparent, and many of these symptoms are non-specific, or don’t necessarily point directly to the disease process causing them. Symptoms, though, are the body’s feedback mechanism, letting you know something is wrong. They are an internal organ’s cry for help.
When medications are prescribed for chronic diseases, they just interrupt this feedback mechanism and mute the distress signals. Medications don’t heal chronic disease. They can prevent progression of the disease, or they can mitigate damage while healing takes place, but they don’t heal the disease. They don’t help the body return to its state of self-sustained equilibrium. A medication may improve various measurable indicators, such as blood glucose levels or blood pressure, but that shouldn’t be considered healing. Sometimes so much damage has been done that healing isn’t possible, and medications are required to maintain a new steady state, but that is a far rarer case than we are led to believe. The danger with quickly prescribing a medication at the first sign of a disease, when it’s not actively threatening life or serious organ damage, is that it hides the effects of the disease and gives the illusion of healing, but allows the actual disease process to progress and gain a stronger foothold. It’s like pulling the leaves off a weed in your garden without getting to the roots. It may look like the weed is gone, but its roots are continuing to grow and become stronger.
The longer a person can maintain a self-sustained equilibrium, the greater their chances of living a long and active life. Once medications enter the picture, it is a slow decline from there. Of course, if a disease has become severe, declining medication treatment could lead to an earlier than necessary demise. In frustrating symmetry, in the same way that a chronic illness develops over years, healing often takes a similar amount of time. It’s not a quick resolution, so sometimes medications can help manage risk while lifestyle changes are implemented. Medications should always be started with the intent to stop them as soon as possible. While they do mask symptoms, and in so doing diminish the urgency of making healthy changes, they are sometimes necessary.
Modern medicine has fallen into the trap of high time preference. Time preference is the value a person places on a reward received in the present vs receiving it in the future. Someone with a high time preference wants the reward now, while one with a low time preference is willing to wait.
The famous Stanford marshmallow experiment provides a good illustration of this concept. A child was placed in a room with a marshmallow. The child was then told that if they waited to eat the marshmallow until the researcher came back (after about 15 minutes), they would get a second marshmallow. The children who ate the marshmallow would be described as having a high time preference, while those who waited had a low time preference. One conclusion, drawn from follow-up studies decades after the initial experiment, was that low time preference correlated with success and achievement later in life. Specifically, children who waited tended to have higher SAT scores, lower levels of substance abuse, better social skills, and better responses to stress. Another heuristic: a “spender” has a high time preference, while a “saver” has a low one.
Quickly prescribing medication once a chronic disease is diagnosed provides the immediate satisfaction of improved lab results or symptom relief. (Again, sometimes symptoms are severe enough that medication is necessary to mitigate harmful side effects.) This isn’t healing the disease, though, and leads to persistence of the underlying process. A moderately high blood sugar doesn’t immediately need metformin. A statin medication isn’t needed right away for elevated cholesterol. SSRIs aren’t urgently necessary for depression, and usually take weeks to even become effective in improving mood.
High time preference medicine involves prescribing a medication for the immediate reward of improved metrics rather than investing the time and energy necessary to figure out and fix the actual problem. This is what pharmaceutical corporations want. They want us to buy into the charade that improved symptoms are the same as healing. With any intervention there is a risk/benefit calculation to be made. It’s naturally easier to see the possible benefits of intervention and ignore the possible risks. The most underappreciated risk is that treating the symptoms of disease just leads to continued progression of the disease. Once a medication is started, it is far more likely that a patient will eventually need increased doses or additional medications than that they will eventually need less or none. The disease process that caused the symptom marches onward, even though the symptom, the bodily organ’s cry for help, has been muted.
Medications enable unhealthy lifestyles. When we think of “enabling”, we generally think of dangerous addictive substances such as alcohol or drugs. The “enabler” implicitly supports the addict’s behavior by ignoring or covering up their mistakes and preventing the addict from facing the consequences of their actions. When a bodily organ starts to get overworked, signs and symptoms start to show up. Some are subtle and detected only on blood work. Others, such as an expanding waistline or the constant thirst and urination of early diabetes, are less subtle. Medications often blunt and distort these feedback mechanisms. Disease symptoms are the body’s way of telling you something is wrong, and using medications to suppress them enables a continued unhealthy lifestyle. Enabling unhealthy eating habits isn’t as acutely dangerous as enabling a heroin addiction, but it still encourages a person to continue down a dangerous path.
The thinking is often that the risk revealed by an abnormal finding on lab work (or by some other measure) needs to be mitigated immediately. Without fixing the underlying problem, though, this just pushes risk down the road and allows it to grow. It is often better to tolerate some risk in the short term while working towards healing than to intervene immediately and, by masking the problem, allow the disease process to progress and cause likely greater damage in the future. The only way to truly eliminate risk is to eliminate the disease process.
What, then, is the doctor’s role? How does a doctor return to the role of a healer? The number of people currently suffering from chronic disease is unprecedented; doctors of old didn’t have to deal with this. Facing an unprecedented challenge requires a complete re-imagining of the doctor’s role. The current system obviously isn’t working. There are myriad outside forces pushing increased chronic disease, and it’s essentially impossible to eliminate them. As discussed above, there are also multiple constraints within which a doctor must work.
First, a doctor must resist the urge to practice high time preference medicine and prescribe medications too quickly. A physician’s role is to educate, encourage, and help their patients reach their individual health goals. It can include warning of the dangers of unhealthy lifestyles that are becoming more and more accepted. (Of course a company that manufactures fattening foods wants to normalize obesity.) A doctor needs to be able to cut through the corporate and political interventions and give their patient clear and concise medical advice. This requires intensive critical thinking on the part of the doctor, because even generally accepted medical society guidelines are not free from outside subterfuge. In the same way investors have different risk appetites for their portfolios, patients will have different risk preferences when it comes to their health. One patient may be okay with the 10% ten-year risk of a heart attack compared with starting a medication, while another cannot tolerate any increased risk that could be mitigated with a statin. Or it may simply be that a patient can’t make the appropriate changes and openly uses a medication as a crutch to support their desired lifestyle. But again, the high time preference paradigm of modern medicine underemphasizes intervention risks, overstates their benefits, and subtly embeds this paradigm in doctors from the very first day of medical school. Doctors are to follow society guidelines, and any deviation is branded quackery.
Most important is recognizing that a doctor doesn’t actually heal anyone of chronic disease. Prescribing a medication is enabling, not healing. Healing comes through lifestyle changes, and these must be made by the patient. More often than not, these lifestyle changes involve removing the offending action or substance. It’s the doctor’s job to understand and point out what needs to be removed and to develop individual strategies to work towards that goal. The practice of medicine through the removal of dangerous influences, or medicina via negativa, is the only way to truly follow the Hippocratic Oath. Any additive intervention will have some harmful consequences. To be fair, the Hippocratic Oath also promises free medical education and decries abortion, so we gave up on it long ago, but we barely even pay lip service to its maximally condensed form of “Do No Harm.”
A patient must ultimately heal themselves. The new paradigm through which a doctor should be viewed is as a guide and confidant along the pathway to healing. This sounds mushy, but it’s important to recognize that a doctor’s role in chronic disease is to educate and encourage, and burdening the physician with the responsibility to heal is misinformed and can lead to dangerous over-intervention.
Many people are fighting uphill battles against unfair genetic pre-dispositions to disease, but the result isn’t foreordained. The cure is a change in lifestyle. Regardless of the malady, healing happens from the inside out. Healing can only take place after one accepts responsibility and resolves to change. The responsibility can’t be outsourced onto someone else. It doesn’t have to be done alone or without help, but there is not a single medication that can heal someone against their will. Putting the burden of responsibility on the patient is often considered naïve or hopeless, and maybe it is, but it is the only way to fix progressively worsening public health.
In classical literature, heroes were more commonly praised for their efforts than their outcomes. Heroes could, and often did, lose, but the glory was in the fight. Over time, the view of heroism gradually shifted to results rather than process. The hero was the one who won. This view is harmful: it encourages high time preference thinking and values short term outcomes over long term ones. Modern medicine focuses too often on achieving short term results at the expense of investing in long term health. Healing is a process. We already know we’re in a losing battle. We will all undoubtedly die. The longer we can keep our bodies close to the self-sustained equilibrium, though, the more prepared we are to deflect the various maladies that come our way. The glory and heroism, then, is in the fight. It’s a quixotic endeavor full of unstoppable forces and immovable objects, but it is a fight worth fighting, and it doesn’t need to be fought alone. A doctor must resist the constraints of the medical system and become the helpful squire supporting the knight in their fight against the ravages of modernity that have sent so many down a road of ill health and unfulfilled potential.