Nanotech drugs reach the FDA in all their obscurity

news · 7 years ago
by Krisztián Niesz

When I was browsing through Chemical & Engineering News recently, an article about how the Food & Drug Administration (FDA) struggles to come up with appropriate processes for reviewing new nanotechnology-based drug applications caught my attention, and got me thinking about how, or whether, cheminformatics could be called in to help. According to the article, “Mapping Nanotech Drug’s Landscape” (subscription required), the FDA sees the main cause of the problem as inconsistent characterization data from drug manufacturers, or in some cases no data at all.

Nanomaterials in medicinal chemistry

Nanotechnology itself has long since moved past being a mere buzzword and has by now spread into many technological areas, providing great benefits for society. Although it is a difficult, possibly overwhelming task to keep track of every commercially used “nano” product, thousands are believed to be distributed worldwide, mainly in the electronics, cosmetics, automotive and medical industries, and without doubt more are yet to come. There is also a recognizable trend in recently reported applications for nanomaterials not just to play a passive role, which translated into medicinal chemistry would mean acting as a drug carrier, but to be designed as active structures in “smart drugs”.

The FDA reports that it has received more than 150 new drug applications involving nanomaterials to date (158, to be exact). Not all of these applications, however, contain the consistent data needed for a thorough review. Reportedly, in more than half of the cases the size of the nanoparticles was given but no information was provided about the measurement technique. How can this be? It is like playing in a U21 football team without showing a valid birth certificate; the ever-young Cameroonian international defender Tobie Mimboe could tell us all about it.

As a former nanotechnologist myself, I can see the difficulty manufacturers face when it comes to analyzing objects on the atomic scale. How to measure the “real” size of nanoparticles, for example, has always been a subject of debate in the scientific community. Can I really trust electron microscopy, which produces nice images but only of the fraction of particles that happen to be anchored on the supporting grid? What about other techniques, such as light scattering, X-ray and laser diffraction, or surface area measurements? Which value should be considered the most accurate, and therefore the one that determines size-dependent physicochemical properties such as solubility, absorption and biodistribution? The best-case scenario, of course, would be for all of these techniques to yield exactly the same number (or at least similar ones); this is an illusion, however, because of the different physics and theories behind the measurement methods (a small illustration follows below). Another issue that requires attention is the size distribution of the nanomaterial. In many cases researchers can do no better than a standard deviation of a few percent, regardless of the approach used (e.g. bottom-up or top-down). Even that few percent, though, might be just a little too much, and the resulting broad size distribution could cause heterogeneous behavior in solubility, for instance. Many more questions like these could be raised; we are just scratching the surface. Aggregation of nanoparticles may also cause severe problems, and here, ironically, smaller is worse: smaller particles have higher surface energies and therefore tend to aggregate to a greater extent.
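
To make the “different physics, different numbers” point concrete, here is a minimal, purely illustrative Python sketch on synthetic data (not a real measurement). It compares a number-weighted mean diameter, roughly what counting particles in electron micrographs approximates, with an intensity-weighted mean of the same population, roughly what dynamic light scattering reports in the Rayleigh regime, where scattering intensity grows with about the sixth power of the diameter:

```python
import random
import statistics

# Hypothetical, illustrative population of particle diameters (nm):
# a roughly "20 nm" synthesis with a few-percent spread. Not real data.
random.seed(0)
diameters = [random.lognormvariate(mu=3.0, sigma=0.08) for _ in range(10_000)]

# Number-weighted mean: what counting particles on a TEM grid approximates.
number_mean = statistics.fmean(diameters)

# Intensity-weighted mean: dynamic light scattering in the Rayleigh regime
# weights each particle roughly by d**6, so larger particles dominate.
weights = [d ** 6 for d in diameters]
intensity_mean = sum(d * w for d, w in zip(diameters, weights)) / sum(weights)

# Relative standard deviation of the size distribution, in percent.
rsd = statistics.stdev(diameters) / number_mean * 100

print(f"number-weighted mean   : {number_mean:.1f} nm")
print(f"intensity-weighted mean: {intensity_mean:.1f} nm")
print(f"relative std. deviation: {rsd:.1f} %")
```

Even for this well-behaved synthetic distribution the two means do not coincide, which is exactly why a size value without its measurement technique is hard to review.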

Safety issues related to nanotech drugs deserve a paragraph of their own. Although this is one of the most discussed scientific topics in the field, with several research groups making progress (e.g. Pompa et al., Nanoscale, 2011, 3, 2889; also subscription required), nanoparticle toxicology in living systems remains unsolved, and we are still far from being able to predict how safe a new nanomaterial would be, let alone make it safer.

Overall, the FDA has recognized that its current review process is not mature enough to deal with the growing number of nanotech-based drug applications, and this is especially true for materials created by reformulation after passing Phase III clinical trials. New analytical methods, e.g. dissolution methods, have to be developed, and the current safety evaluation and risk analysis methods have to be improved.
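
As a point of reference for the kind of dissolution comparison already used for reformulated products, here is a minimal Python sketch of the similarity factor f2, which regulators use to judge whether two dissolution profiles are comparable. The profiles below are made up for illustration; whether and how such metrics carry over to nano-reformulations is exactly the kind of open question raised here:

```python
import math

def f2_similarity(reference, test):
    """Similarity factor f2 for two dissolution profiles.

    reference, test: percent dissolved at the same time points.
    f2 >= 50 is conventionally taken to mean the profiles are similar.
    """
    if len(reference) != len(test) or not reference:
        raise ValueError("profiles must be non-empty and equally long")
    mean_sq_diff = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    return 50 * math.log10(100 / math.sqrt(1 + mean_sq_diff))

# Hypothetical profiles (% dissolved at 10, 20, 30 and 45 min) for an original
# formulation and a nano-reformulation; the numbers are invented.
original = [32, 55, 74, 92]
reformulated = [38, 61, 80, 95]
print(f"f2 = {f2_similarity(original, reformulated):.1f}")  # ~63, i.e. "similar"
```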

Back to the original question I raised above: how can cheminformatics help in this difficult matter? Predicting the physicochemical properties of nanomaterials, such as those used in ADMETox work, with particle size taken into account, is certainly one area to exploit. This, though, requires better-trained software, and hence deeper and better-organized collaboration between software companies and academia (which supplies the actual data for training the tools). First of all, however, it has to be decided what information we want to harvest and use when it comes to nanotech drugs. Once we know, chemo- and bioinformatics tools could be developed to build a standardized database of intelligently registered nanomaterials along with filtered and calculated chemical information (a sketch of what such a record might capture follows below). Obviously this will not help make nanotech drugs with better precision; fortunately, that is not our task. Our job is to support the professional synthetic chemists and materials scientists in the laboratory. It is good to know that steps are being taken in this direction too: the National Cancer Institute, for instance, has founded the Alliance for Nanotechnology in Cancer in the hope of breakthroughs in cancer research (the most active therapeutic sector for nanotech drug applications). If I may put in my two cents, I would expect a dramatic increase in the number (currently 158) of nanotechnology-based medicinal applications in the near future.
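
To make the “intelligently registered nanomaterials” idea a little more tangible, here is a minimal, hypothetical Python sketch of such a registry record. The field names and validation rules are my own illustration, not an existing ChemAxon or FDA schema; the point is simply that a size value should never be accepted without the technique and distribution information that give it meaning:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SizeMeasurement:
    mean_nm: float                        # reported mean diameter
    technique: str                        # e.g. "TEM", "DLS", "XRD", "BET"
    std_dev_nm: Optional[float] = None    # width of the size distribution
    weighting: str = "number"             # "number", "intensity", "volume", ...

@dataclass
class NanomaterialRecord:
    name: str
    core_composition: str                 # e.g. "Fe3O4", "PLGA", "Au"
    surface_coating: Optional[str] = None
    sizes: list[SizeMeasurement] = field(default_factory=list)

    def validate(self) -> list[str]:
        """List exactly the gaps the FDA article complains about."""
        issues = []
        if not self.sizes:
            issues.append("no size data at all")
        for s in self.sizes:
            if not s.technique:
                issues.append(f"size {s.mean_nm} nm reported without a technique")
            if s.std_dev_nm is None:
                issues.append(f"{s.technique}: no size-distribution width given")
        return issues

record = NanomaterialRecord(
    name="example iron oxide carrier",
    core_composition="Fe3O4",
    surface_coating="dextran",
    sizes=[SizeMeasurement(mean_nm=21.4, technique="TEM", std_dev_nm=1.8)],
)
print(record.validate() or "record is complete enough to review")
```

A registry built around records like this could be queried, filtered and fed into property-prediction tools, which is where cheminformatics can genuinely add value.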
