Redefining Computing with Quantum Advantage

First published 2024

This CreateAnEssay4U special edition brings together the work of previous essays and provides a comprehensive overview of an important technological area of study. For source information, see also:

https://createanessay4u.wordpress.com/tag/quantum/

https://createanessay4u.wordpress.com/tag/computing/

In the constantly changing world of computational science, principles of quantum mechanics are shaping a new frontier, set to transform the foundation of problem-solving and data processing. This emerging frontier is characterised by a search for quantum advantage – a pivotal moment in computing, where quantum computers surpass classical ones in specific tasks. Far from being just a theoretical goal, this concept is a motivating force for the work of physicists, computer scientists, and engineers, aiming to unveil capabilities previously unattainable.

Central to this paradigm shift is the quantum bit or qubit. Unlike classical bits restricted to 0 or 1, qubits operate in a realm of quantum superposition, embodying both states simultaneously. This capability drastically expands computational potential. For example, Google’s quantum computer, Sycamore, used qubits to perform calculations that would be impractical for classical computers, illustrating the profound implications of quantum superposition in computational tasks.
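
As a rough illustration of what superposition means computationally, the sketch below (a generic single-qubit example, not code for Sycamore or any particular machine) represents a qubit by two complex amplitudes and samples measurement outcomes from them:

```python
# Illustrative sketch only: a single qubit in an equal superposition,
# represented by two complex amplitudes over the basis states |0> and |1>.
import numpy as np

alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)            # amplitudes for |0> and |1>
assert np.isclose(abs(alpha)**2 + abs(beta)**2, 1.0)     # amplitudes must be normalised

# Measurement collapses the superposition: outcome 0 or 1 with these probabilities.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=[abs(alpha)**2, abs(beta)**2])
print(samples.mean())  # ~0.5, reflecting equal weight on both basis states
```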

The power of quantum computing stems from the complex interaction of superposition, interference, and entanglement. Interference, similar to the merging of physical waves, manipulates qubits to emphasise correct solutions and suppress incorrect ones. This process is central to quantum algorithms, which, though challenging to develop, harness interference patterns to solve complex problems. An example of this is IBM’s quantum computer, which uses interference to perform complex molecular simulations, a task far beyond the reach of classical computers.
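
A toy numerical sketch (an assumed two-path setup, not any specific algorithm) makes the difference concrete: classical probabilities for reaching an outcome simply add, whereas quantum amplitudes can cancel, suppressing that outcome entirely:

```python
# Illustrative only: two routes to the same outcome, with opposite quantum phases.
import numpy as np

amp_path_1 = 1 / np.sqrt(2)
amp_path_2 = -1 / np.sqrt(2)                         # same magnitude, opposite phase

prob_classical = 0.5 + 0.5                           # probabilities add: 1.0
prob_quantum = abs(amp_path_1 + amp_path_2) ** 2     # amplitudes add first: 0.0
print(prob_classical, prob_quantum)                  # destructive interference suppresses the outcome
```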

Entanglement in quantum computing creates a unique correlation between qubits, where the state of one qubit is intrinsically tied to another, irrespective of distance. This “spooky action at a distance” allows for a collective computational behaviour surpassing classical computing. Quantum entanglement was notably demonstrated in the University of Maryland’s quantum computer, which used entangled qubits to execute algorithms faster than classical computers could.
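
A minimal statistical sketch, assuming an idealised Bell pair (|00⟩ + |11⟩)/√2 and ignoring real-device noise, shows what this correlation looks like: each qubit on its own appears random, yet the two always agree.

```python
# Idealised Bell-pair statistics (illustrative only, not hardware output).
import numpy as np

rng = np.random.default_rng(0)
joint = rng.choice(["00", "11"], size=100_000)       # only correlated outcomes have any weight

qubit_a = np.array([int(s[0]) for s in joint])
qubit_b = np.array([int(s[1]) for s in joint])

print(qubit_a.mean(), qubit_b.mean())                # each ~0.5: individually random
print((qubit_a == qubit_b).mean())                   # 1.0: perfectly correlated
```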

Quantum computing’s applications are vast. In cryptography, quantum computers can potentially break current encryption algorithms. For instance, quantum algorithms developed at MIT have shown, in principle, the ability to crack encryption methods that would otherwise be secure against classical computational attacks. This has spurred the development of quantum-resistant algorithms in post-quantum cryptography.

Quantum simulation, a key application of quantum computing, was envisioned by physicist Richard Feynman and is now close to reality. Quantum computers, like those developed at Harvard University, use quantum simulation to model complex molecular structures, significantly impacting drug discovery and material science.

Quantum sensing, an application of quantum information technology, leverages quantum properties for precise measurements. A prototype quantum sensor developed by MIT researchers, capable of detecting various electromagnetic frequencies, exemplifies the advanced capabilities of quantum sensing in fields like medical imaging and environmental monitoring.

The concept of a quantum internet interconnecting quantum computers through secure protocols is another promising application. The University of Chicago’s recent experiments with quantum key distribution demonstrate how quantum cryptography can secure communications against even quantum computational attacks.

Despite these applications, quantum computing faces challenges, particularly in hardware and software development. Quantum computers are prone to decoherence, where qubits lose their quantum properties. Addressing this, researchers at Stanford University have developed techniques to prolong qubit coherence, a crucial step towards practical quantum computing.

The quantum computing landscape is rich with participation from startups and established players like Google and IBM, and bolstered by government investments. These collaborations accelerate advancements, as seen in the development of quantum error correction techniques at the University of California, Berkeley, enhancing the stability and reliability of quantum computations.

Early demonstrations of quantum advantage have been seen in specialised applications. Google’s achievement in using quantum computers for complex tasks like random number generation is an example. However, the threat of a “quantum winter,” a period of reduced interest and investment, looms if practical applications don’t quickly materialise.

In conclusion, quantum advantage represents a turning point in computing, propelled by quantum mechanics. Its journey is complex, with immense potential for reshaping various fields. As this field evolves, it promises to tackle complex problems, from cryptography to material science, marking a transformative phase in technological advancement.

Links

https://www.nature.com/articles/s41586-022-04940-6

https://www.quantumcomputinginc.com/blog/quantum-advantage/

https://www.ft.com/content/e70fa0ce-d792-4bc2-b535-e29969098dc5

https://semiengineering.com/the-race-toward-quantum-advantage/

https://www.cambridge.org/gb/universitypress/subjects/physics/quantum-physics-quantum-information-and-quantum-computation/

The Implications of Artificial Intelligence Integration within the NHS

First published 2023

This CreateAnEssay4U special edition brings together the work of previous essays and provides a comprehensive overview of an important technological area of study. For source information, see also:

https://createanessay4u.wordpress.com/tag/ai/

https://createanessay4u.wordpress.com/tag/nhs/

The advent and subsequent proliferation of Artificial Intelligence (AI) have ushered in an era of profound transformation across various sectors. Notably, within the domain of healthcare, and more specifically within the context of the United Kingdom’s National Health Service (NHS), AI’s incorporation has engendered a myriad of both unparalleled opportunities and formidable challenges. From an academic perspective, there is a burgeoning consensus that AI might be poised to rank among the most salient and transformative developments in the annals of human progression. It is neither hyperbole nor mere conjecture to assert that the innovations stemming from AI hold the potential to redefine the contours of our societal paradigms. In the ensuing discourse, we shall embark on a rigorous exploration of the multifaceted impacts of AI within the NHS, striving to delineate the promise it holds while concurrently interrogating the potential pitfalls and challenges intrinsic to such profound technological integration.

Medical Imaging and Diagnostic Services play a pivotal role in the modern healthcare landscape, and the integration of AI within this domain has brought forth noteworthy advancements. AI’s robust capabilities for image analysis have not only enhanced the precision in diagnostics but also broadened the scope of early detection across a variety of diseases. Radiology professionals, for instance, increasingly leverage these advanced tools to identify diseases at early stages and thereby minimise diagnostic errors. Echocardiograms, used to assess heart function and detect conditions such as ischaemic heart disease, are another beneficiary of AI’s analytical prowess. An example of this is the Ultromics platform, developed in Oxford, which employs AI to meticulously analyse echocardiography scans.

Moreover, the application of AI in diagnostics transcends cardiological needs. From detecting skin and breast cancer, eye diseases, pneumonia, to even predicting psychotic occurrences, AI’s potential in medical diagnostics is vast and promising. Neurological conditions like Parkinson’s disease can be identified through AI tools that examine speech patterns, predicting its onset and progression. In the realm of endocrinology, a study used machine learning models to foretell the onset of diabetes, revealing that a two-class augmented decision tree was most effective in predicting diabetes-associated variables.
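
As a hedged sketch of the kind of model described in that study (a generic two-class decision tree; the data below are synthetic placeholders rather than real clinical variables), a scikit-learn version might look like this:

```python
# Illustrative sketch only: a two-class decision tree for a diabetes-style
# prediction task, trained on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)  # placeholder features
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier(max_depth=4, random_state=0)
model.fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))  # held-out accuracy of the tree
```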

Furthermore, the emergence of COVID-19 at the end of 2019 and its escalation into a global threat also saw AI playing a crucial role in early detection and diagnosis. Numerous medical imaging tools, encompassing X-rays, CT scans, and ultrasounds, employed AI techniques to assist in the timely diagnosis of the virus. Recent studies have spotlighted AI’s efficacy in differentiating COVID-19 from other conditions like pneumonia using imaging modalities like CT scans and X-rays. The surge in AI-based diagnostic tools, such as the deep learning model known as the transformer, facilitates efficient management of COVID-19 cases by offering rapid and precise analyses. Notably, the ImageNet-pretrained vision transformer was used to identify COVID-19 cases using chest X-ray images, showcasing the adaptability and precision of AI in response to pressing global health challenges.

Moreover, advancements in AI aren’t limited to diagnostic models alone. The field has seen the emergence of tools like Generative Adversarial Networks (GANs), which have considerably influenced radiological practices. Comprising a generator that produces images mirroring real ones, and a discriminator that differentiates between the two, GANs have the potential to redefine radiological operations. Such networks can replicate training images and create new ones with the training dataset’s characteristics. This technological advancement has not only aided in tasks like abnormal detection and image synthesis but has also posed challenges even for experienced radiologists, as discerning between GAN-generated and real images becomes increasingly intricate.
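
The adversarial structure being described can be outlined in a short skeleton. The sketch below is a minimal PyTorch illustration under assumed toy dimensions (flattened 28×28 images and random placeholder data), not a production radiology GAN:

```python
# Minimal GAN skeleton: a generator learns to produce images the discriminator
# cannot tell apart from real ones. Data and sizes here are placeholders.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 28 * 28

generator = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                          nn.Linear(256, img_dim), nn.Tanh())
discriminator = nn.Sequential(nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
                              nn.Linear(256, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

real_images = torch.rand(32, img_dim)                    # stand-in batch of "real" images

# Discriminator step: label real images 1 and generated images 0.
fake_images = generator(torch.randn(32, latent_dim)).detach()
d_loss = loss_fn(discriminator(real_images), torch.ones(32, 1)) + \
         loss_fn(discriminator(fake_images), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: try to make the discriminator call the fakes "real".
g_loss = loss_fn(discriminator(generator(torch.randn(32, latent_dim))), torch.ones(32, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```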

Education and research also stand to benefit immensely from such advancements. GANs have the potential to swiftly generate training material and simulations, addressing gaps in student understanding. As an example, if students struggle to differentiate between specific medical conditions in radiographs, GANs could produce relevant samples for clearer understanding. Additionally, GANs’ capacity to model placebo groups based on historical data can revolutionise clinical trials by minimising costs and broadening the scope of treatment arms.

Furthermore, the role of AI in offering virtual patient care cannot be overstated. In a time where in-person visits to medical facilities posed risks, AI-powered tools bridged the gap by facilitating remote consultations and care. Moreover, the management of electronic health records has been vastly streamlined due to AI, reducing the administrative workload of healthcare professionals. It’s also reshaping the dynamics of patient engagement, ensuring they adhere to their treatment plans more effectively.

The impact of AI on healthcare has transcended beyond diagnostics, imaging, and patient care, making significant inroads into drug discovery and development. AI-driven technologies, drawing upon machine learning, bioinformatics, and cheminformatics, are revolutionising the realm of pharmacology and therapeutics. With the increasing challenges and sky-high costs associated with drug discovery, these technologies streamline the processes and drastically reduce the time and financial investments required. Historical precedents, like the AI-based robot scientist named Eve, stand as a testament to this potential. Eve not only accelerated the drug development process but also ensured its cost-effectiveness.

AI’s capabilities are not just confined to the initial phase of scouting potential molecules in the field of drug discovery. There’s a promise that AI could engage more dynamically throughout the drug discovery continuum in the near future. The numerous AI-aided drug discovery successes in the literature are a testament to this potential. A notable instance is the work by the Toronto-based firm Deep Genomics. Harnessing the power of an AI workbench platform, they identified a novel genetic target and consequently developed the drug candidate DG12P1, aimed at treating a rare genetic variant of Wilson’s disease.

One of the crucial aspects of drug development lies in identifying novel drug targets, as this could pave the way for pioneering first-in-class clinical drugs. AI proves indispensable here. It not only helps in spotting potential hit and lead compounds but also facilitates rapid validation of drug targets and the subsequent refinement in drug structure design. Another noteworthy application of AI in drug development is its ability to predict potential interactions between drugs and their targets. This capability is invaluable for drug repurposing, enabling existing drugs to swiftly progress to subsequent phases of clinical trials.

Moreover, with the data-intensive nature of pharmacological research, AI tools can be harnessed to sift through massive repositories of scientific literature, including patents and research publications. By doing so, these tools can identify novel drug targets and generate innovative therapeutic concepts. For effective drug development, models can be trained on extensive volumes of scientific data, ensuring that the ensuing predictions or recommendations are rooted in comprehensive research.

Furthermore, AI’s applications aren’t just limited to drug discovery and design. It’s making tangible contributions in drug screening as well. Numerous algorithms, such as extreme learning machines, deep neural networks (DNNs), random forests (RF), support vector machines (SVMs), and nearest-neighbour classifiers, are now at the forefront of virtual screening. These are employed based on their synthesis viability and their capacity to predict in vivo toxicity and activity, thereby ensuring that potential drug candidates are both effective and safe.
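
A minimal sketch of virtual screening framed as classification follows; the fingerprints and activity labels are random placeholders, and a random forest stands in for any of the algorithms listed above:

```python
# Illustrative only: score candidate compounds as active/inactive with a
# random forest trained on placeholder binary "fingerprints".
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
fingerprints = rng.integers(0, 2, size=(500, 1024))   # stand-in molecular fingerprints
labels = rng.integers(0, 2, size=500)                 # stand-in activity labels

screen = RandomForestClassifier(n_estimators=200, random_state=0)
screen.fit(fingerprints, labels)

new_compounds = rng.integers(0, 2, size=(5, 1024))
print(screen.predict_proba(new_compounds)[:, 1])      # predicted probability of activity
```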

The proliferation of AI in various sectors has brought along with it a range of ethical and social concerns that intersect with broader questions about technology, data usage, and automation. Central among these concerns is the question of accountability. As AI systems become more integrated into decision-making processes, especially in sensitive areas like healthcare, who is held accountable when things go wrong? The possibility of AI systems making flawed decisions, often due to intrinsic biases in the datasets they are trained on, can lead to catastrophic outcomes. An illustration of such a flaw was observed in an AI application that misjudged pneumonia-related complications and potentially jeopardised patients’ health. These erroneous decisions, often opaque in nature due to the intricate inner workings of machine learning algorithms, further fuel concerns about transparency and accountability.

Transparency, or the lack thereof, in AI systems poses its own set of challenges. As machine learning models continually refine and recalibrate their parameters, understanding their decision-making process becomes elusive. This obfuscation, often referred to as the ‘black-box’ phenomenon, hampers trust and understanding. The branch of AI research known as “Explainable Artificial Intelligence (XAI)” attempts to remedy this by making the decision-making processes of AI models understandable to humans. Through XAI, healthcare professionals and patients can glean insights into the rationale behind diagnostic decisions made by AI systems. Furthermore, this enhances the trust quotient, as evidenced by studies that underscore the importance of visual feedback in fostering trust in AI models.
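
One simple and widely used form of explanation is permutation importance, which ranks input features by how much shuffling them degrades the model’s predictions. The sketch below is a generic scikit-learn example on a bundled dataset, not a clinical XAI system:

```python
# Illustrative only: expose which features a "black-box" model relies on.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

data = load_breast_cancer()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

result = permutation_importance(model, data.data, data.target, n_repeats=5, random_state=0)
ranked = sorted(zip(data.feature_names, result.importances_mean), key=lambda pair: -pair[1])
for name, score in ranked[:5]:
    print(f"{name}: {score:.3f}")   # features whose shuffling hurts accuracy the most
```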

Another prominent concern is the potential reinforcement of existing societal biases. AI systems, trained on historically accumulated data, can inadvertently perpetuate and even amplify biases present in the data, leading to skewed and unjust outcomes. This is particularly alarming in healthcare, where decisions can be a matter of life and death. This threat is further compounded by data privacy and security issues. AI systems that process sensitive patient information become prime targets for cyberattacks, risking unauthorised access or tampering of data, with motives ranging from financial gain to malicious intent.

The rapid integration of AI technologies in healthcare underscores the need for robust governance. Proper governance structures ensure that regulatory, ethical, and trust-related challenges are proactively addressed, thereby fostering confidence and optimising health outcomes. On an international level, regulatory measures are being established to guide the application of AI in domains requiring stringent oversight, such as healthcare. The European Union, for instance, introduced the General Data Protection Regulation (GDPR), which came into force in 2018 and set forth data protection standards. More recently, the European Commission proposed the Artificial Intelligence Act (AIA), a regulatory framework designed to ensure the responsible adoption of AI technologies, mandating rigorous assessments for high-risk AI systems.

From a technical standpoint, there are further substantial challenges to surmount. For AI to be practically beneficial in healthcare settings, it needs to be user-friendly for healthcare professionals (HCPs). The technical intricacies involved in setting up and maintaining AI infrastructure, along with concerns of data storage and validity, often act as deterrents. AI models, while potent, are not infallible. They can manifest shortcomings, such as biases or a susceptibility to being easily misled. It is, therefore, imperative for healthcare providers to strategise effectively for the seamless implementation of AI systems, addressing costs, infrastructure needs, and training requirements for HCPs.

The perceived opaqueness of AI-driven clinical decision support systems often makes HCPs sceptical. This, combined with concerns about the potential risks associated with AI, acts as a barrier to its widespread adoption. It is thus imperative to emphasise solutions like XAI to bolster trust and overcome the hesitancy surrounding AI adoption. Furthermore, integrating AI training into medical curricula can go a long way in ensuring its safe and informed usage in the future. Addressing these challenges head-on, in tandem with fostering a collaborative environment involving all stakeholders, will be pivotal for the responsible and effective proliferation of AI in healthcare. Recent events, such as the COVID-19 pandemic and its global implications alongside the Ukraine war, underline the pressing need for transformative technologies like AI, especially when health systems are stretched thin.

Given these advancements, it is nevertheless pivotal to scrutinise the sources of this information. Although formal conflicts of interest should be declared in publications, authors may hold subconscious biases, for or against the implementation of AI in healthcare, which may influence their interpretation of the data. Discussions are inevitable regarding published research, particularly since the concept of ‘false positive findings’ came to the forefront in 2005 in a review by John Ioannidis (“Why Most Published Research Findings Are False”). The observation that journals are biased towards publishing papers with positive rather than negative findings both skews the total body of evidence and underscores the need for studies to be accurate, representative, and negligibly biased. When dealing with AI, where the risks are substantial, relying solely on justifiable scientific evidence becomes imperative. Studies used for the implementation of AI systems should be mediated by a neutral and independent third party to ensure that any advancements in AI system implementations are based solely on justified scientific evidence, and not on personal opinions, commercial interests or political views.

The evidence reviewed undeniably points to the potential of AI in healthcare. There is no doubt that there is real benefit in a wide range of areas. AI can enable services to be run more efficiently, allow selection of patients who are most likely to benefit from a treatment, boost the development of drugs, and accurately recognise, diagnose, and treat diseases and conditions.

However, with these advancements come challenges. We identified some key areas of risk: the creation of good-quality big data and the importance of consent; data risks such as bias and poor data quality; the issue of the black box (lack of transparency of algorithms); data poisoning; and data security. Workforce issues were also identified: how AI works alongside the current workforce and the fear of workforce replacement; the risk of de-skilling; and the need for education and training, and for embedding change. It was also identified that there is a current need for research into the use, cost-effectiveness, and long-term outcomes of AI systems. There will always be a risk of bias, error, and chance statistical improbability in research and published studies, fundamentally due to the nature of science itself. Yet the aim is to have a body of evidence that helps create a consensus of opinion.

In summary, the transformative power of AI in the healthcare sector is unequivocal, offering advancements that have the potential to reshape patient care, diagnostics, drug development, and a myriad of other domains. These innovations, while promising, come hand in hand with significant ethical, social, and technical challenges that require careful navigation. The dual-edged sword of AI’s potential brings to light the importance of transparency, ethical considerations, and robust governance in its application. Equally paramount is the need for rigorous scientific evaluation, with an emphasis on neutrality and comprehensive evidence to ensure AI’s benefits are realised without compromising patient safety and care quality. As the healthcare landscape continues to evolve, it becomes imperative for stakeholders to strike a balance between leveraging AI’s revolutionary capabilities and addressing its inherent challenges, all while placing the well-being of patients at the forefront.

Links

https://www.gs1ca.org/documents/digital_health-affht.pdf

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7670110/

https://www.who.int/emergencies/diseases/novel-coronavirus-2019/technical-guidance/naming-the-coronavirus-disease-(COVID-2019)-and-the-virus-that-causes-it

https://www.rcpjournals.org/content/futurehosp/9/2/113

https://doi.org/10.1016%2Fj.icte.2020.10.002

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9151356/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7908833/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8285156/

https://pubmed.ncbi.nlm.nih.gov/32665978

https://doi.org/10.1016%2Fj.ijin.2022.05.002

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8669585/

https://scholar.google.com/scholar_lookup?journal=Med.+Image+Anal.&title=Transformers+in+medical+imaging:+A+survey&author=F.+Shamshad&author=S.+Khan&author=S.W.+Zamir&author=M.H.+Khan&author=M.+Hayat&publication_year=2023&pages=102802&pmid=37315483&doi=10.1016/j.media.2023.102802&

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8421632/

https://www.who.int/docs/defaultsource/documents/gs4dhdaa2a9f352b0445bafbc79ca799dce4d.pdf

https://www.bbc.com/news/health-42357257

https://www.ibm.com/blogs/research/2017/1/ibm-5-in-5-our-words-will-be-the-windows-to-our-mental-health/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10057336/

https://doi.org/10.48550%2FarXiv.2110.14731

https://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0020124

https://scholar.google.com/scholar_lookup?journal=Proceedings+of+the+IEEE+15th+International+Symposium+on+Biomedical+Imaging&title=How+to+fool+radiologists+with+generative+adversarial+networks?+A+visual+turing+test+for+lung+cancer+diagnosis&author=M.J.M.+Chuquicusma&author=S.+Hussein&author=J.+Burt&author=U.+Bagci&pages=240-244&

https://pubmed.ncbi.nlm.nih.gov/23443421

https://www.nuffieldbioethics.org/assets/pdfs/Artificial-Intelligence-AI-in-healthcare-and-research.pdf

https://link.springer.com/article/10.1007/s10916-017-0760-1

Quantum Computing: Unlocking the Complexities of Biological Sciences

First published 2023

Quantum computing is positioned at the cutting-edge juncture of computational science and biology, promising revolutionary solutions to complex biological problems. The intertwining of advanced experimentation, theoretical advancements, and increased computing prowess has traditionally powered our understanding of intricate biological phenomena. As the demand for more robust computing infrastructure increases, so does the search for innovative computing paradigms. In this milieu, quantum computing (QC) emerges as a promising development, especially given the recent strides in technological advances that have transformed QC from mere academic intrigue to concrete commercial prospects. These advancements in QC are supported and encouraged by various global policy initiatives, such as the US National Quantum Initiative Act of 2018, the European Quantum Technologies Flagship, and significant efforts from nations like the UK and China.

At its core, quantum computing leverages the esoteric principles of quantum mechanics, which predominantly governs matter at the molecular scale. Particles, in this realm, manifest dual characteristics, acting both as waves and particles. Unlike classical computers, which use randomness and probabilities to achieve computational outcomes, quantum computers operate using complex amplitudes along computational paths. This introduces a qualitative leap in computing, allowing for the interference of computational paths, reminiscent of wave interference. While building a quantum computer is a daunting task, with current capabilities limited to around 50-100 qubits, their inherent potential is astounding. The term “qubit” designates a quantum system that can exist in two distinct states, and in superpositions of them, similar to a photon’s potential path choices between two optical fibres. It is this scalability of qubits that accentuates the power of quantum computers.

A salient feature of quantum computation is the phenomenon of quantum speedup. Simplistically, while both quantum and randomised computers navigate the expansive landscape of possible bit strings, the former uses complex-valued amplitudes to derive results, contrasting with the addition of non-negative probabilities employed by the latter. Determining the instances and limits of quantum speedup is a subject of intensive research. Some evident advantages are in areas like code-breaking and simulating intricate quantum systems, such as complex molecules. The continuous evolution in the quantum computing arena, backed by advancements in lithographic technology, has resulted in more accessible and increasingly powerful quantum computers. Challenges do exist, notably the practical implementation of quantum RAM (qRAM), which is pivotal for many quantum algorithms. However, a silver lining emerges in the form of intrinsically quantum algorithms, which are designed to leverage quintessential quantum features.
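
Two textbook results make the notion of quantum speedup concrete (standard complexity statements included here for orientation, not claims drawn from this essay's sources):

```latex
\begin{align*}
\text{Unstructured search (Grover):}\quad
  & O(N)\ \text{classical queries} \;\longrightarrow\; O(\sqrt{N})\ \text{quantum queries},\\[4pt]
\text{Factoring an $n$-bit integer (Shor):}\quad
  & \exp\!\bigl(O(n^{1/3}(\log n)^{2/3})\bigr)\ \text{classically (best known)} \;\longrightarrow\; O(n^{3})\ \text{quantum gates}.
\end{align*}
```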

The potential applications of quantum computing in biology are vast and multifaceted. Genomics, a critical segment of the biological sciences, stands to gain enormously. By extrapolating recent developments in quantum machine learning algorithms, it’s plausible that genomics applications could soon benefit from the immense computational power of quantum computers. In neuroscience, the applications are expected to gravitate toward optimisation and machine learning. Additionally, quantum biology, which probes into chemical processes within living cells, presents an array of challenges that could be aptly addressed using quantum computing, given the inherent quantum nature of these processes. However, uncertainties persist regarding the relevance of such processes to higher brain functions.

In summation, while the widespread adoption of powerful, universal quantum computers may still be on the horizon, history attests to the fact that breakthroughs in experimental physics can occur unpredictably. Such unforeseen advancements could expedite the realisation of quantum computing’s immense potential in tackling the most pressing computational challenges in biology. As we venture further into this quantum age, it’s evident that the fusion of quantum computing and biological sciences could redefine our understanding of life’s most intricate mysteries.

Links

https://www.nature.com/articles/s41592-020-01004-3

https://ts2-space.webpkgcache.com/doc/-/s/ts2.space/en/decoding-the-quantum-world-of-biology-with-artificial-intelligence/

The Biological Significance of Transport in Animals and Plants

First published 2023

Ultimately, all living organisms require nutrients and gases for respiration or photosynthesis to produce energy, which allows the organism to carry out metabolic processes. This exchange can take place either through simple diffusion (for unicellular organisms such as amoeba, where gases simply diffuse across the cell membrane down a concentration gradient) or, for multicellular organisms, through a complex transport system (such as that within mammals). Which route is needed is determined by the surface area to volume ratio of the organism: the smaller the surface area to volume ratio, the more slowly the necessary molecules would be transported by simple diffusion, creating the need for a transport system. Both plants and animals use transport systems to transport food molecules, with a key difference being that plants do not use the transport system to fight disease.
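
A quick worked example shows why this ratio falls as organisms get bigger (a cube of side length L is used purely for illustration):

```latex
\[
\frac{\text{surface area}}{\text{volume}} \;=\; \frac{6L^{2}}{L^{3}} \;=\; \frac{6}{L},
\qquad
L = 1\,\mu\text{m} \Rightarrow \tfrac{SA}{V} = 6\,\mu\text{m}^{-1},
\qquad
L = 1\,\text{cm} = 10^{4}\,\mu\text{m} \Rightarrow \tfrac{SA}{V} = 6\times10^{-4}\,\mu\text{m}^{-1}.
\]
```

The ten-thousand-fold drop in the ratio is why diffusion alone cannot supply a larger organism quickly enough.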

Oxygen is essential within humans in order to produce ATP. Within red blood cells, a protein called haemoglobin is used to transport oxygen around the body. Haemoglobin molecules have a quaternary structure made from four polypeptide chains. Each chain is attached to a haem group composed of a porphyrin ring bound to a ferrous iron (Fe²⁺) ion. Haemoglobin binds (associates) with oxygen because each of these four iron ions can combine with a single oxygen molecule, allowing each haemoglobin molecule to carry up to four oxygen molecules at once. Oxygen association happens as soon as oxygen enters the bloodstream via diffusion through the alveoli in the lungs. Haemoglobin also releases (dissociates) oxygen, which diffuses out of the red blood cells and across the capillary walls into skeletal muscle cells. Haemoglobins with a high affinity for oxygen take up oxygen more easily but release it less easily.

Additionally, haemoglobins with a low affinity for oxygen take up oxygen less easily but release it more easily. This is extremely useful as haemoglobin is remarkably able to change its affinity for oxygen by changing its shape (through slightly different amino acid sequences creating different tertiary and quaternary structures) under different conditions. For example, in the presence of carbon dioxide, the haemoglobin molecule changes shape, causing it to bind more loosely to oxygen. As a result, haemoglobin releases its oxygen. Furthermore, the transport of oxygen by haemoglobin is extremely significant as it allows the movement of oxygen from the lungs, through the bloodstream and into muscle tissue for aerobic respiration to release energy for muscle contraction.

Interestingly, under anaerobic conditions, muscles generate lactic acid so quickly that the pH of the blood passing through them lowers to around 7.2, causing haemoglobin to release around 10% more of its oxygen. This is called the Bohr effect: the drop in pH (together with the accompanying rise in carbon dioxide) changes the shape of the haemoglobin molecule, lowering its affinity for oxygen, so more oxygen dissociates exactly where the respiring tissue needs it.

Additionally, glucose is another vital molecule needed in humans for aerobic respiration, in which glucose combines with oxygen to produce carbon dioxide and water as waste products. Within the human digestive system, the active transport and absorption of glucose take place within the ileum. Glucose is transported by active transport from the gut into intestinal epithelial cells, but crosses the membrane of red blood cells by facilitated diffusion. Glucose is also absorbed in the ileum through facilitated diffusion (a form of passive transport, unlike active transport). This is diffusion involving protein carrier molecules which allow the passive movement of substances across plasma membranes (in this case through the membranes of the microvilli on the epithelial cell surfaces in the ileum).

Microvilli are extensions on epithelial cells which increase the surface area for the insertion of carrier proteins through which diffusion, facilitated diffusion and active transport can take place. Alternatively, more protein channels and carrier proteins can be used to increase the rate of movement across membranes. Due to the constant digestion of carbohydrates and proteins (through the breakdown of disaccharides such as maltose and polysaccharides such as starch to produce glucose monomers), there is usually a higher concentration of glucose molecules in the ileum than in the blood. The rich blood supply around the ileum helps maintain this steep concentration gradient which also increases the rate of facilitated diffusion. Within facilitated diffusion, the carrier proteins bind to glucose, causing them to change shape and translocate the glucose from one side of the membrane to the other. This is vital for some organisms (including humans) because it allows glucose to be extracted from the food we eat and usefully transferred into the blood where it is transported to the mitochondria in muscle cells for aerobic respiration.

Furthermore, active transport is the movement of molecules or ions into or out of a cell from a region of lower to a region of higher concentration using ATP and carrier proteins. The ATP is needed to individually move molecules against a concentration gradient. The process of active transport is used within humans through carrier protein molecules which act as ‘pumps’ such as a ‘sodium potassium pump’. Active transport of a single molecule or ion involves the receptor sites of a carrier protein binding to one side of the molecule or ion. ATP also binds to the protein, causing it to split into an ADP and a phosphate molecule. As a result, the protein molecule changes shape and opens the opposite side of the membrane.

The molecule or ion is then released to the other side of the membrane. Finally, the phosphate molecule is released from the protein to revert to its original shape, ready for the process to be repeated. The phosphate molecule then recombines with the ADP to form ATP during respiration. This same mechanism is used specifically within a sodium potassium pump where more than one molecule is moved in the opposite direction at the same time by active transport. Powered by ATP, the pump moves sodium and potassium ions in opposite directions, each against its concentration gradient.

In a single cycle of the pump, three sodium ions are extruded from and two potassium ions are imported into the cell. This pump is vital for maintaining perfect ion concentrations inside and outside of cells to allow biological processes including heart contractions and kidney functions to take place. Also, a sodium potassium pump is used within the brain to transmit signals using a flow of sodium and potassium ions which produce an electrical spike called an action potential.

Within plants, organic molecules and mineral ions are transported from one part of a plant to another by a process called translocation, which occurs through phloem vessels. Phloem is made up of sieve tube elements, whose perforated end walls form sieve plates, combined with companion cells. The companion cells are connected to the sieve tube elements through plasmodesmata (thin strands of cytoplasm linking the cells). Companion cells have many mitochondria in order to produce energy for translocation in the phloem, and they may also regulate translocation.

Translocation is potentially achieved through the mass flow theory. Mass flow is the bulk movement of a substance through a given channel or area in a specified time. Mass flow of sucrose through sieve tube elements firstly takes place when sucrose produced by photosynthesising cells is actively transported into the sieve tubes, which causes the sieve tubes to have a lower water potential. As the xylem has a less negative water potential, water moves from the xylem into the sieve tubes by osmosis, which creates a hydrostatic pressure within them. At the respiring cells, sucrose is either used up during aerobic respiration or is converted to starch for storage. These cells therefore have a low sucrose content, and so sucrose is actively transported into them from the sieve tubes, lowering their water potential.

Water also moves from the sieve tubes into respiring cells by osmosis. This lowers the hydrostatic pressure there, resulting in a high hydrostatic pressure at the source and a low one at the sink. The relative pressures within sieve tubes act as evidence in favour of this theory; however, not all solutes move at the same speed, which suggests the theory may not be completely correct. The dissolved sugars transported by the phloem are responsible for affecting plant growth, including seed, leaf and fruit development.

Finally, water enters plants through extensions of root epidermal cells called root hairs. In flowering plants, this water is then transported up and around the plant in xylem vessels by a process called transpiration. The energy required for this passive process is supplied by the sun. Xylem vessels transport water from the roots to the leaves through cohesion-tension. This is when water evaporates from mesophyll cells due to heat from the sun, leading to transpiration. Cohesion happens when water molecules form hydrogen bonds with each other and tend to stick together. Water forms a continuous, unbroken column across the mesophyll cells and down the xylem.

As water evaporates from the mesophyll cells in the leaf into the air spaces beneath the stomata, more molecules of water are drawn up due to cohesion. A column of water is therefore pulled up the xylem as a result of transpiration. This is known as transpiration pull. This puts the xylem under tension, meaning there is a negative pressure within the xylem. Xylem vessels therefore allow water to move from the roots up the plant. This is useful as the water can contain water-soluble nutrients which are beneficial to the plant for growth; additionally, a high water content keeps cells turgid, and this turgor pressure prevents the plant from drooping. Water is also used in the leaves during photosynthesis.

In conclusion, transport systems are very beneficial to both multicellular plants and animals as they allow essential nutrients, molecules, ions and gases to be used efficiently within the organism.

Links

http://www.biology-resources.com/drawing-amoeba-breathing.html

https://www.s-cool.co.uk/a-level/biology/transport/remember-it/s-cool-revision-summary

https://www.britannica.com/science/hemoglobin

https://en.wikipedia.org/wiki/Bohr_effect

https://www.encyclopedia.com/plants-and-animals/botany/botany-general/companion-cell

https://sciencing.com/can-glucose-diffuse-through-the-cell-membrane-by-simple-diffusion-12731920.html

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4274886/

Chemistry Beyond Earth: Cosmic Ice and Its Implications for Life

First published 2022; revised 2023

The vast expanse of our universe presents intriguing mysteries that push the boundaries of our Earthly experiences. One of the most compelling phenomena that challenge our conventional understanding is the chemistry of cosmic ice. This unique form of ice, as revealed by research at NASA’s Goddard Space Flight Center and supported by extensive literature, provides profound insights into the fundamental differences between the chemistry we observe on Earth and that which unfolds in the cosmos.

At the heart of this contrast is the distinctive environment in which these chemical reactions take place. Earthly chemistry operates within familiar parameters of temperature, pressure, and atmospheric conditions. Conversely, the extreme conditions of space are characterised by intense cold, potent radiation, and near-vacuum environments. Such extremities lead to reactions primarily involving gases or solids, given the general absence of liquids in the vast interstellar void.

The mystery of cosmic ice, however, goes beyond these environmental differences. Research led by Perry Gerakines at NASA’s Goddard Space Flight Center’s Cosmic Ice Lab replicates the conditions of outer space to study the exotic amorphous ice. This unique state of water is unlike the familiar crystalline structure found on Earth. Instead, amorphous ice, being disorderly and erratic, is formed under intense cold and low pressures. This deviation in the structure of something as fundamental as ice underscores the contrasting chemistries of space and our home planet.

But the significance of cosmic ice isn’t limited to its intriguing structure. It serves as a cornerstone for a widely accepted hypothesis about the origins of life on Earth. Many scholars propose that vital molecular precursors to life were formed in space, either within the dense interstellar medium (ISM) or later in the Solar Nebula. These molecules, shaped and transformed by their journey through low-temperature ices exposed to various ionizing radiations, were eventually delivered to Earth through celestial events. This interstellar voyage of molecules, coupled with the chemistry of cosmic ice, may have played a critical role in seeding life on our planet.

Deep space, far from being a desolate expanse, teems with complex organic molecules. Astronomers have identified intricate molecules, such as ethylene glycol, in the gas phase of the ISM and even within comets in our Solar System. While observing such vast molecules is fraught with challenges due to overlapping rotational lines in the spectra, our endeavours to find prebiotic molecules have been fruitful. The discovery of amino acids in meteorites like Murchison and in cometary dust particles from Comet 81P/Wild 2 provides tantalising evidence of the potential life-precursors that dance through the cosmos.

Astrochemical studies offer a fascinating window into the survivability of organic molecules in cosmic conditions, especially amino acids. The quantified “half-life doses” of specific amino acids, such as glycine, alanine, and phenylalanine, reveal how long these molecules can endure in extraterrestrial radiation environments. In dense ISM regions, for instance, amino acids are likely to have half-lives of around 10⁷ years, aligning with the expected life cycle of an interstellar cloud core before it collapses into a protostar. The diffuse ISM, with its heightened cosmic-ray fluxes, can considerably reduce these half-lives.

The longevity of these amino acids is not just contingent on the region of space but also on the depth within extraterrestrial bodies. On Oort-cloud comets, the amino acids near the surface have a half-life ranging from 10⁶ to 10⁸ years. Pluto’s subsurface conditions suggest even more extended amino acid half-lives, ranging from 1–4 × 10⁸ years. On the other hand, Europa, one of Jupiter’s moons, presents a starkly different picture. Its radiation-rich environment implies that amino acids on its surface might last only a few years. As we delve deeper into Europa, below 1 metre, the half-lives extend to potentially 6–10 million years. Mars offers yet another contrast with surface amino acid half-lives of about 10⁸ years due to proton bombardment.
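
Assuming simple first-order destruction, these survival times follow the standard half-life relation; as a worked example using the dense-ISM value of 10⁷ years quoted above:

```latex
\[
\frac{N(t)}{N_{0}} \;=\; \left(\tfrac{1}{2}\right)^{t/t_{1/2}},
\qquad
t_{1/2} = 10^{7}\ \text{yr},\; t = 3\times10^{7}\ \text{yr}
\;\Longrightarrow\;
\frac{N}{N_{0}} = \left(\tfrac{1}{2}\right)^{3} = \tfrac{1}{8}.
\]
```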

The implications of these findings are profound. The presence of amino acids in meteoritic and cometary samples suggests that they have been effectively shielded from cosmic radiation, perhaps by H2O ice or other substances. It’s tantalising to consider that if these amino acids were formed in the dense ISM, they could endure the collapse of a dense cloud core into a protostar and integrate into the primordial materials forming comets or planetesimals. Within our Solar System’s icy realms, many amino acids could survive for tens to hundreds of millions of years, provided they are embedded a few centimetres deep. It raises the prospect of detecting recently formed molecules at such depths in future exploration missions.

Moreover, the observed infrared spectra of amino acids deposited at low temperatures (15 K) reveal an intriguing conversion behaviour. The shift from the non-zwitterion to the zwitterion upon heating suggests that extraterrestrial amino acids might predominantly exist in the zwitterionic form if their icy environments ever experience temperatures of 140 K or higher. It’s essential for experimental designs studying the half-lives of these amino acids in cosmic settings to account for this transformation. The protective effect of H2O-ice on amino acids further emphasises the significance of understanding the interplay between organic molecules and their surrounding environments. Amino acids seem to thrive best within specific depth thresholds in icy celestial bodies, offering a tantalising hint at where we might focus our explorative endeavors in the quest to understand life’s cosmic origins.

In piecing together these findings, we begin to appreciate the profound implications of the chemistry of cosmic ice. Not only does it serve as a testament to the vast differences between Earthly and cosmic chemistries, but it also reshapes our understanding of the origins of life and the boundless potential of the universe. As we continue to delve into these cosmic mysteries, they not only intrigue and challenge our current knowledge but also inspire awe at the intricate dance of molecules across the vast canvas of space.

Links

https://www.sciencedirect.com/science/article/pii/S0019103512002187

https://www.mpiwg-berlin.mpg.de/research/projects/DeptIII-ChristinaWessely-Welteislehre

StereoChemistry: Enantiomers and Racemic Mixtures

First published 2022

Stereochemistry is an essential branch of chemistry that focuses on the study of the three-dimensional structures of molecules. One of the fundamental concepts in stereochemistry is that of enantiomers and their specific optical properties. Enantiomers are pairs of chiral compounds with exactly the same connectivity, but opposite three-dimensional shapes. Importantly, enantiomers are not the same as each other; one enantiomer cannot be superimposed on the other. However, enantiomers are mirror images of each other, so contain the same quantity of atoms, in the same ratio, just with a different 3D orientation in space. They have the same melting point, the same solubility, and so on. Two compounds that are almost identical, but mirror images of each other, have exactly the same kinds of intermolecular attraction, so unsurprisingly, their physical properties are identical. It can be shown using group theory, the mathematics of symmetry, that an enantiomer may also be defined as a molecule that does not contain a mirror plane, meaning it cannot be divided into two identical and opposite halves. Chirality is often illustrated with the idea of left- and right-handedness in that a left hand and right hand are mirror images of each other but are not superimposable.

The ability of enantiomers to rotate the plane of polarised light distinguishes them. Dextrorotatory enantiomers rotate plane-polarised light (light whose waves oscillate in a single plane) by a certain number of degrees in one direction, i.e. clockwise (+). Levorotatory enantiomers rotate the plane-polarised light by the same number of degrees, but in the opposing direction, i.e. anticlockwise (−). The direction in which an enantiomer rotates plane-polarised light is therefore denoted with a plus (+) or a minus (−) symbol, where clockwise rotation is plus (+) and anticlockwise is minus (−). This optical rotation is due to the interaction of the plane-polarised light with a substance: if the substance is not symmetric, the light will have had its plane of oscillation rotated upon leaving the material, i.e. the direction of oscillation of the transverse light waves changes. The rotation itself is an inherent property, but its magnitude depends on the substance, the path length, and the concentration. These properties can be used to distinguish the two enantiomers and determine their relative concentrations in a given sample. Enantiomers can be separated to give an enantiopure sample, but this is an energy- and time-consuming process.
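
The magnitude is conventionally reported as a specific rotation, which normalises the observed rotation by path length and concentration (the standard textbook relation, not a result specific to this essay's sources):

```latex
\[
[\alpha]_{\lambda}^{T} \;=\; \frac{\alpha_{\text{obs}}}{l \times c},
\qquad
l = \text{path length in dm},
\quad
c = \text{concentration in g\,mL}^{-1},
\]
```

where the sign of the specific rotation identifies the enantiomer and, for a sample of known composition, its magnitude reflects the relative proportions present.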

One unique scenario is when a sample contains an equal mixture of these enantiomers. This is termed a racemic mixture. A racemic mixture is a mixture composed of equal quantities of (+) dextrorotatory and (-) levorotatory enantiomers; hence, such mixtures will not exhibit a net rotation of plane-polarised light as each enantiomer’s rotation of the plane-polarised light is cancelled. This cancellation can be conceptualised as having a sine wave, and a duplicate sine wave, but with an inverted phase; the two waves interact and cancel, and the net result is no oscillation. This is also the primary principle behind active noise cancellation in headphones. The presence of a racemic mixture in a sample underscores the importance of being able to distinguish and separate these unique forms of chiral molecules, as they may have vastly different effects in various applications, including drug development and organic synthesis.
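
The cancellation analogy can be made concrete with a few lines of numerical illustration (a generic sine-wave demonstration, not a simulation of polarimetry):

```python
# Two identical sine waves with inverted phase sum to zero everywhere,
# mirroring how equal and opposite rotations cancel in a racemic mixture.
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
wave = np.sin(2 * np.pi * 5 * t)        # a 5 Hz sine wave
inverted = -wave                        # the same wave, phase-inverted

print(np.max(np.abs(wave + inverted)))  # 0.0: complete cancellation
```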

The applications of stereochemistry, particularly the study of enantiomers, span across various fields. Perhaps one of the most critical areas is in the pharmaceutical industry. Many drugs are chiral molecules, meaning they have enantiomers. It’s crucial to understand that while these enantiomers are chemically similar, they can have dramatically different biological effects. One enantiomer might be therapeutic and beneficial, while its mirror image could be inert or even harmful. Thalidomide is a notorious example. One enantiomer was effective as a sedative and anti-nausea medication, particularly for pregnant women, but its mirror image caused severe birth defects. Thus, the ability to produce and identify enantiopure drugs is paramount for patient safety.

Additionally, in the world of fragrances and flavours, chiral molecules play a pivotal role. Enantiomers can have different smells or tastes even though they are chemically almost identical. For instance, one enantiomer might smell like lemons while its mirror image has the aroma of oranges. Such subtleties can make a significant difference in the formulation of perfumes, food products, and beverages. Moreover, in the field of materials science, understanding the stereochemistry can be crucial for the design and synthesis of new materials with specific properties, given that the spatial arrangement of atoms can influence attributes like strength, flexibility, and reactivity.

In conclusion, stereochemistry, and especially the study of enantiomers, provides a deep insight into the subtle nuances of molecular structures. The seemingly minor differences in three-dimensional orientation have profound effects on physical properties, biological activity, and overall functionality. Recognising and harnessing these differences have been and continue to be essential for advancements in medicine, food and beverage industries, and materials science. As our understanding of stereochemistry grows, so will our ability to tailor molecules for specific needs, enhancing the quality of products and, by extension, our lives.

Links

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC353039/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5765859/

https://cmtext.indiana.edu/acoustics/chapter1_phase2.php

https://www.sydney.edu.au/science/chemistry/~george/isomers.html

Thalidomide and Its Historical and Modern Implications

First published 2021

Thalidomide is a sedative drug that was developed and marketed in the 1950s and sold worldwide to pregnant women as a treatment for morning sickness and nausea. The drug was developed and tested by the German pharmaceutical company Chemie Grünenthal. This seemed like a breakthrough at the time, particularly for pregnant women who had been suffering from the uncomfortable symptoms of nausea.

Thalidomide had been tested on rodents, and no median lethal dose was found. Hence, the drug was deemed safe. By today’s standards, the extent of this testing would be deemed very poor. Crucially, there was no form of testing on pregnant women (or even on gestating animals) to ensure the drug had no teratogenic effects (causing harm to an unborn foetus), and this is especially pertinent, as the drug’s primary purpose was to treat nausea in pregnant women. Over the next few years, there were many reports of congenital malformations in new-born children, and eventually, the link was drawn between these congenital defects and Thalidomide.

An estimated 10,000 infants were affected worldwide, with many additional uncounted stillbirths and miscarriages. An estimated 40% of the affected infants did not live past the first year of life. Thalidomide caused a range of birth defects, including absence of the auricles with deafness, defects of the muscles of the eye and face, absence or hypoplasia of the arms (especially affecting the radius and the thumb), thumbs with three joints, defects of the femur and tibia, and malformations of the heart, the bowel, the uterus, and the gallbladder. It was later shown that the majority of malformations occurred when the drug was ingested by the mother between the 34th and the 50th day after the last menstruation. Specific deformities correlated with specific days of exposure, such as absence of ears and deafness occurring between the 35th and 37th day, and thumbs with three joints forming around the 46th to 48th day.

After the discovery of a string of birth defects, scientists worked frantically to understand the mystery unfolding in front of them, but an answer wouldn’t be reached until many years later. As with many pharmaceutical drugs, Thalidomide is a racemic mixture, meaning it contains equal quantities of its two enantiomers. The function of each enantiomer in a racemic drug is wholly dependent on the drug itself. For instance, the (+) enantiomer might be the active component, producing the desired effect, while the (−) enantiomer might be inert. In the case of Thalidomide, the (R)-enantiomer has sedative effects, while the (S)-enantiomer is teratogenic, causing congenital defects.

Years of research still leave some ambiguity around the exact cause of the drug’s teratogenic effects. The drug’s half-life is between 8 and 12 hours, and it can undergo hydrolysis in bodily fluids as well as being metabolised by cytochrome P450 enzymes. It is believed that the teratogenesis arises from a combination of the (S)-parent molecule and its breakdown products. Most likely, the drug breaks down inside the mother, with the resulting products then passing across the placental membrane to the foetus. A particular breakdown product, CPS49, in conjunction with tubulin and the (S)-parent molecule, is believed to inhibit angiogenesis, the creation of new blood vessels. This results in stunted foetal vascular development and, consequently, congenital defects.

Even if the enantiomers of Thalidomide had been separated, Thalidomide embryopathy would still have occurred. Under biological conditions, the drug can convert (racemise) between its enantiomeric forms, making a stable, non-teratogenic preparation nearly impossible to produce. Animal testing also revealed diverse reactions; for instance, mice and rats weren't affected as severely as rabbits, chickens, and human foetuses.

In modern medicine, the Thalidomide scandal has led to rigorous drug testing and development reforms. Drugs now undergo extensive trials, and any side effect must be promptly reported. Interestingly, Thalidomide is now used in treating conditions like Leprosy and Multiple Myeloma because of its antiangiogenic properties. Regrettably, black market trading and information deficits have resulted in continued Thalidomide-related congenital defects, particularly in Brazil.

The Thalidomide disaster is an indelible mark on medical history, causing over 10,000 infants to be born with defects. As medicine progresses, we can only aspire to prevent such tragedies in the future.

Links

https://pubchem.ncbi.nlm.nih.gov/compound/Thalidomide

https://thalidomide.ca/en/what-is-thalidomide/

https://www.acs.org/molecule-of-the-week/archive/t/thalidomide.html

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC43727/

https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1755-3768.1993.tb04997.x

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5835787/

https://pubmed.ncbi.nlm.nih.gov/26043938/

https://pubmed.ncbi.nlm.nih.gov/3067415/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3573415/

https://www.hyle.org/journal/issues/22-1/ruthenberg.pdf