There has been an ongoing evolution in the treatment of psychosis, schizophrenia, bipolar disorder, and other significant psychiatric morbidities. As therapies have evolved, efficacy in treating positive and negative symptoms has, for the most part, remained quite high, and there has been a concerted push, as the generations of drugs progress, to reduce the off-target and unwanted side effects of treatment (which also helps boost adherence to medication regimens). Dr Ben Locwin shares his expertise on the future of pharmacologic treatment.

It’s Been a Monumental Six Decades

There is a long and storied history of antipsychotics with which we’re all familiar, punctuated by instances of improvement and regression. The initial clinical uses of chlorpromazine (the first antipsychotic) were met with exuberant psychiatric care notes, with passages such as, “I couldn’t believe the improvement we all witnessed.”



When Joseph Hamon’s work using chlorpromazine on a manic patient was extended to larger trials and eventual approval, the drug was named “Largactil” because of its “large actions” across many different symptoms (and, therefore, receptor sites). These include antihistaminic, adrenolytic, gangliolytic, and antiemetic properties.

First-generation antipsychotics (FGAs) are D2 antagonists (among the other sites mentioned above), and so they work primarily and nonselectively across many regions of the brain, including the mesocortical, mesolimbic, nigrostriatal, and tuberoinfundibular pathways. Because of the broad effects of this nonselective D2 blockade, FGAs tend to produce a host of unpleasant (and potentially dangerous) side effects, such as neuroleptic malignant syndrome, tardive dyskinesia, extrapyramidal symptoms, and other neurological effects. A complicated balance has thus existed between revolutionary treatment potential and life-altering adverse events. The 1950s were the start of the “psychopharmacologic revolution.”

Ben Locwin, PhD, MS, MBA

In 1958, several tricyclic compounds were synthesized in the lab based on the antidepressant imipramine. The first to be aggressively trialed, with positive effects, was clozapine. There is a great and rich history of psychopharmacology between the 1970s and the 1990s, wherein the groundwork was laid for the second-generation antipsychotics (SGAs), with clozapine as the prototype.

Also known as “atypical antipsychotics,” SGAs became widely used because of their demonstrated efficacy and relative lack of adverse events compared with FGAs, especially extrapyramidal side effects. The major mechanistic difference between first- and second-generation antipsychotics is the preferential docking and receptor binding at D2 and D3 (partial agonist activity), with very few muscarinic, adrenergic alpha-1, and histamine-1 effects. SGAs also tend to block 5-HT2A receptors.

Third-generation (tertiary) antipsychotics are further differentiated by their downstream receptor agonist and antagonist effects. By binding preferentially to D3 rather than D2 (alone or in combination with D2), along with partial agonism of D2 receptors, specific pharmacologic effects can be elicited. Remember, there are two subfamilies of dopamine receptors, which were themselves reclassifications of previous naming conventions: the so-called D1-like receptors (D1 and D5) and the D2-like receptors (D2, D3, and D4). D3 receptors are predominantly located in areas associated with some psychotic symptoms, such as the ventral striatum, nucleus accumbens, thalamus, hippocampus, and cortex.

In addition, D3 receptors are interesting targets because of their unexpected distribution within the brain, relatively lower abundance, and higher affinity for dopamine; it may be that targeting them produces fewer motor effects than the sites targeted by other antipsychotics. This is likely the source of the selective effects noted with newer therapies such as aripiprazole (Abilify), cariprazine (Vraylar), brexpiprazole (Rexulti), and lumateperone (Caplyta), among others in use and still under evaluation in clinical trials.

(More) Modern Therapies

The most common adverse events I have seen with many of these tertiary-generation antipsychotics, either clinically (anecdotally) or in trials, have been akathisia, insomnia (and somnolence), dry mouth, weight gain (and loss), and of course the other effects noted in the trials found in the prescribing information.

There have been many misconceptions about second- and tertiary-generation antipsychotics in the media and among clinicians, chief among them the notion that SGAs and TGAs somehow represent a “more significant” or “more dire” pharmacologic treatment than others, such as SSRIs or NDRIs. This is entirely untrue. Many other psychiatric treatments carry a broader and deeper range of side effects, some more severe. That this opinion of the drug class persists is largely rooted in chemophobia. Pharmaceuticals, and chemicals generally, are devoid of agency or motive in and of themselves; it’s how they’re used (prescribed, dosed, and evaluated) that gives them an effect, either positive or negative. The thresholds for these effects, the width of therapeutic windows, side effects, and other nuances are the realm of clinical practice and toxicology. While I think those in the psychiatric realm are, to some extent, keenly aware of this and sensitive to it, I also believe it bears repeating.

Also, I should say that “more modern” isn’t necessarily synonymous with “more effective” or “better.” It tends to mean more targeted receptor effects and/or fewer resultant unwanted side effects. This can certainly make these drugs “better,” but again, chemistry is agnostic to our value judgments. Though similar efficacy against positive symptoms has been reported with second- and tertiary-generation therapies, the jury is still out on their comparative reduction of negative symptoms relative to FGAs. As more TGAs enter and complete clinical trials (and we accrue more and more postmarket data in the form of clinical responses and anecdotes), we’ll begin to get a better picture of alternative receptor-binding regimes and overall population effects.

This is also why psychiatrists and other prescribers have increasingly been availing themselves of genetic analysis of patients, to gather insight into their metabolic characteristics (cytochromes, etc). These tests can suggest which patients are “faster” metabolizers of some molecules, or for whom certain other drugs may present problems. But I would caution that many of these tests aren’t “ready for primetime” for the uses to which they’re being applied. For this, I would use the analogy we use in biostatistics when, for example, we analyze clinical trial results: such analytical tests “should be used as a drunk uses a lamppost — for support, not for illumination.” As these tests improve, and as our understanding of pharmacogenomics (the relationship between an individual’s genetic composition and pharmaceutical treatment) grows, data of this nature will become more illuminative than it currently is.

The Takeaway

Always rely on patient feedback within clinical decision-making; therapies are necessarily designed to improve how the patient feels, functions, or survives. Too often, we may be seduced into using newer technologies or buzzworthy tests because of their apparent objective quantitative value (even if spurious!), while simultaneously devaluing the actual in vivo clinical feedback from the very patients in whom these therapies are being evaluated.