- Neuron ( @racer983@mander.xyz ) 3•1 year ago
I agree with this. The area of medicine I’m most involved in has seen a crazy rate of new medication approvals and innovations, with a giant pipeline of possibilities on the way. The article focuses a lot on CRISPR, which is cool and always gets the headlines, but in the nearer term oligonucleotide therapies and even viral vector gene therapies are already here. Oligonucleotide therapies use RNA to affect gene expression, usually decreasing it. In theory they can be used against any toxic gain-of-function mutation, which covers a lot of genetic diseases. It’s not really a question of whether we have the ability to treat genetic diseases anymore; it’s more about getting the time, money, expertise, and prerequisite natural history work done on the sheer number of them, so these tools can be tested across all of these diseases and brought to patients.
The importance of high-quality natural history studies and biomarker development can’t be overstated either. When you design a clinical trial you need to know how many patients to enroll and how long it needs to run, or else you might accidentally throw out a treatment that works because the trial was underpowered. Natural history studies are where you get that information. Biomarkers can provide more sensitive measures of change, so you can figure out sooner whether a treatment has potential (ideally followed up by proving efficacy with clinical measures too).
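To make the powering point concrete, here’s a minimal sketch (Python, using statsmodels) of a two-arm sample-size calculation for a continuous endpoint. The treatment effect and variability numbers are made up for illustration; in a real trial those estimates are exactly what natural history and biomarker studies provide.

```python
# Sketch: how natural history data feeds a sample-size calculation.
# The numbers below are hypothetical placeholders, not real trial parameters.
from statsmodels.stats.power import TTestIndPower

expected_slowing = 2.0    # assumed treatment effect on some clinical scale
natural_history_sd = 5.0  # variability of decline, estimated from natural history data

effect_size = expected_slowing / natural_history_sd  # standardized effect (Cohen's d)

n_per_arm = TTestIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,            # false-positive rate
    power=0.8,             # chance of detecting a real effect
    alternative="two-sided",
)
print(f"Patients needed per arm: {n_per_arm:.0f}")
```

With these placeholder numbers you end up needing roughly a hundred patients per arm; overestimate the effect or underestimate the variability and the trial comes out too small to see a benefit that is actually there.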
The availability of these tools for making new treatments, combined with limited resources for testing them, is also leading to ethical issues and inequality. For instance, there have been a number of “N of 1” trials where treatments were made specifically for a particular patient. I hope that benefit eventually flows to other patients too, but it does raise a lot of questions.