Abstract About 422 million people worldwide have diabetes, and it has no cure so far. But biomedical engineering offers a glimmer of hope. Patients are no longer committed to taking continuous insulin doses, because an artificial pancreas can do that for them. The artificial pancreas eases the patient's disease journey both psychologically and physically. Equipped with a blood glucose monitor and an insulin pump, it functions much like a natural pancreas. But behind every invention beneficial to humanity lie complications and, often, algorithms.
I. Introduction
According to the WHO, diabetes affects around 422 million people globally and is directly
responsible for 1.6 million deaths per year. Over the last few decades, both the number of cases and
the prevalence of diabetes have significantly increased. Diabetes causes severe problems such as
blindness, kidney failure, heart attacks, stroke, and lower limb amputation.
II. The Incurable Disease
i. Brief about the Pancreas
The pancreas is a pear-shaped vital part of the digestive system and is known as a mixed gland
because it has both exocrine and endocrine functions. The endocrine portion consists of groups of
different types of cells (the islets of Langerhans). The islets of Langerhans contain three types of
cells: alpha, beta, and delta. Each of these cell types secretes a specific hormone. For instance,
alpha cells secrete glucagon, which raises blood glucose levels, while beta cells secrete insulin,
which regulates sugar metabolism and maintains normal sugar levels in the blood.
ii. Type 1 Diabetes
When you eat carbohydrates, chemicals in your small intestine break them down into single sugar
molecules called glucose. The small intestine absorbs the glucose, which then transfers into the
bloodstream. As blood passes through your pancreas, beta cells detect the rising glucose levels and
release insulin into your bloodstream to decrease glucose levels and keep your blood glucose in a
normal range. In type 1 diabetes, however, the immune system destroys these beta cells, so the body
can no longer produce the insulin needed for this regulation.
III. The Glimmer of Hope
i. Artificial Pancreas
So, if you have type 1 diabetes, your objective is to maintain a healthy blood glucose level by
administering the doses of insulin required to lower it. The artificial pancreas plays its role in
this series of functions by performing them continuously and automatically.
ii. Types of Artificial Pancreas
Using an artificial pancreas (AP) is thus valuable for people who experience diabetes, especially type 1 diabetes. Many insulin delivery systems exist, each with its own mechanisms, algorithms, and benefits. According to the official UK diabetes site, there are mainly three types of artificial pancreas.
IV. Closed-Loop Systems: A Close Look
i. Historical Appearance
The closed-loop glucose system is not a new idea; the concept was introduced in the 1960s.
ii. The Algorithms in CLS: Natural introduction
While developing the artificial pancreas, scientists and engineers were certainly inspired by
the systems of equations modeling the natural pancreas's secretion behavior. However, it is
impossible to use these systems directly, because they model the "behavior" of the pancreas; they
are not instructions that can be followed. Also, artificial pancreas systems cannot always measure
the concentrations of the hormones and glucose accurately and instantly. Consequently, prediction
models and error-correcting algorithms are necessary to supply the human body with stable glucose
rates, and that is where the importance of the control algorithms comes in.
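To make the idea of such a system of equations concrete, here is a minimal sketch of a Bergman-style "minimal model" of glucose-insulin dynamics, integrated with forward Euler in Python. All parameter values, units, and initial conditions are illustrative assumptions for demonstration, not values taken from the studies reviewed here.

```python
# A minimal sketch (assumed parameters): Bergman-style minimal model of
# glucose-insulin dynamics, integrated with a simple forward-Euler loop.

def simulate(G0=180.0, X0=0.0, I0=15.0, dt=1.0, t_end=240.0):
    p1, p2, p3 = 0.03, 0.02, 1e-5   # glucose effectiveness and remote-insulin rates (assumed)
    n, Gb, Ib = 0.1, 90.0, 10.0     # insulin clearance, basal glucose/insulin (assumed)
    G, X, I = G0, X0, I0            # glucose, remote insulin action, plasma insulin
    history = []
    for step in range(int(t_end / dt)):
        dG = -p1 * (G - Gb) - X * G     # glucose uptake driven by insulin action
        dX = -p2 * X + p3 * (I - Ib)    # remote insulin effect rises with plasma insulin
        dI = -n * (I - Ib)              # insulin decays toward its basal level
        G, X, I = G + dG * dt, X + dX * dt, I + dI * dt
        history.append((step * dt, G, I))
    return history

# Usage: trace how an initially elevated glucose level relaxes toward basal.
for t, G, I in simulate()[::60]:
    print(f"t={t:5.0f} min  glucose={G:6.1f} mg/dL  insulin={I:5.1f} uU/mL")
```

As the text notes, such a model only describes behavior; a controller still needs prediction and correction layers on top of it.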
iii. The Main Algorithms in the CLS
Many algorithms are frequently used in CLS control devices, and each algorithm can be implemented in different ways; they are principles rather than fixed recipes. There are three main types: Model Predictive Control (MPC), Proportional-Integral-Derivative (PID) control, and Fuzzy Logic Control (FLC). Here, we present these main strategies and the algorithms involved in the technology.
The parameters used in the system of ODEs can be fixed or adaptive. While adaptive values may
sound promising, they must be applied with care because they may result in unstable systems.
Correcting the model according to the measurements differs from making predictions. A simple
approach treats the difference between the measured output and the model prediction at the current
step as a constant, and the correction happens only after the prediction. However, there are better
approaches that make smarter but more complex corrections, such as the Kalman filter.
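As an illustration of such a smarter correction, the following is a minimal sketch of a scalar Kalman-style update that blends a model's glucose prediction with a noisy sensor reading. The variance values are assumed for the example; real CLS implementations are considerably more elaborate.

```python
# A minimal sketch (assumed noise values, not from the reviewed systems):
# scalar Kalman update blending a model prediction with a CGM measurement.

def kalman_update(x_pred, P_pred, z, R=25.0):
    """x_pred: predicted glucose; P_pred: prediction variance;
    z: sensor reading; R: assumed sensor noise variance."""
    K = P_pred / (P_pred + R)           # Kalman gain: how much to trust the sensor
    x_new = x_pred + K * (z - x_pred)   # corrected estimate
    P_new = (1 - K) * P_pred            # uncertainty shrinks after correction
    return x_new, P_new

# Example: the model predicts 150 mg/dL (variance 36), the sensor reads 162.
estimate, variance = kalman_update(150.0, 36.0, 162.0)
print(estimate, variance)   # estimate lands between prediction and measurement
```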
V. Comparison: Which Approach Is Better?
MPC ODEs can be very complicated or relatively simple. The gain constant in PID can have
different values, and modifications like IFB (insulin feedback) and FMPD (fading-memory
proportional derivative) can be applied to modify the behavior of PID.
On the other hand, FL depends mainly on the experience of the person defining the operation rules.
Boquete, a strong MPC proponent, noticed that and mentioned it explicitly at the end of his
discussion: "I also agree that it is very difficult, even in simulation studies, to have a valid
comparison of different algorithms. One way or another, an algorithm must be tuned based on some
performance criterion, so if particular MPC and PID algorithms are tuned with different criteria,
then there is no good way to compare them."
VI. Results
A 6-month trial by the National Institute of Diabetes and Digestive and Kidney Diseases tested
the CLS on type 1 diabetes patients, targeting time spent in the glycemic range.
During the trial, patients who utilized the closed-loop device had lower glycated hemoglobin
levels. The closed-loop system had beneficial glycemic effects during the day and at night, with the
latter being more noticeable in the second half of the night. The glycemic advantages of closed-loop
management were observed in the first month of the study and lasted for six months. The study
population included both insulin pump users and injectable insulin users spanning a wide age range
(14 to 71 years) and baseline range of glycated hemoglobin levels (5.4 to 10.6 percent),
with similar results across these and other baseline features of the cohort.
Patients had to be at least 14 years old and have a clinical diagnosis of type 1 diabetes; they
also had to have been treated with insulin for at least a year using a pump or several daily
injections, with no restriction on the glycated hemoglobin level. The experiment included a
2-to-8-week run-in phase (whose length depended on whether the patient had previously used a pump or
a continuous glucose monitor) to gather baseline data and teach patients how to use the devices.
In this 6-month experiment involving type 1 diabetes patients, the use of a closed-loop system was
linked to a higher percentage of time spent in the target glycemic range than the use of a
sensor-augmented insulin pump.
VII. Future Research
It is important to note that a few problems and gaps in the development of the CLS were not
addressed in previous studies and research. We will address these problems here and introduce
probable solutions for them:
Inconsistent or ambiguous algorithm standards: a great portion of the studies did not mention in
detail the implementation of the algorithm's methods in their research design and methods sections,
or in some cases did not even adhere to them, as the control algorithm may differ from the simulated
algorithm in a simulation study. Examples of standards that should be well documented are the
applied parameter tuning, the algorithm version used, and the number of steps and their duration in
the control and prediction horizons in MPC. This consistency problem causes inaccurate information
and sometimes leads to underestimating an algorithm due to unfair methodology. Clear information
about algorithms and precise methodology documentation should be a necessity in any study that
includes experiments, whether on real subjects or in simulations.
Not enough / inadequate data: A major problem with the research done on the AP so far is the
evident lack of experiments on real subjects. Most AP studies to date have included 7 to 10 persons
at most. In addition, experiments themselves are scarce: it might be surprising that the first
outpatient study based on MPC was reported only in 2014. There is also an absence of diversity in
the studies, because they are usually conducted in North America on adult North American patients.
Diversity in the characteristics of subjects plays an important role in ensuring that a medical
procedure is suitable and beneficial for a wide range of people and conditions. Unfortunately, a
device like the AP might not work as well, or provide the desired stable glucose levels, for Asian
children, for example.
Directing more attention to AP studies, increasing their number, and widening their range of
health conditions and geography will improve the accuracy of the available information.
Possible improvements for the algorithms:
Although the algorithms used in the AP CLS have
greatly improved since the CLS's first introduction in 1964, they are still far from perfect, and
there is definitely room for improvement.
Taking PID as an example, a deviation measure with its own experimentally obtained parameter
could be introduced into the algorithm along with IFB, to favor staying in the target range over
achieving smooth glucose changes and thereby avoid nocturnal hypoglycemia and hyperglycemia. The
added IFB modification is essential to provide data about the insulin delivery history and to make
predictions for the insulin and glucose levels, allowing the deviation measure to function
efficiently. Research on introducing such a deviation measure may provide valuable insight into the
potential of this modification, as illustrated below.
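To show what such a modification might look like, here is a hedged sketch of a PID controller extended with a simple insulin-feedback (IFB) term. The gains, the IFB decay factor, and the 5-minute step are invented for the example and, as noted above, would have to be obtained experimentally.

```python
# A minimal sketch (illustrative only, assumed gains): PID dosing with an
# insulin-feedback term that damps delivery based on insulin history.

class PidIfb:
    def __init__(self, kp=0.05, ki=0.001, kd=0.1, k_ifb=0.5, target=110.0):
        self.kp, self.ki, self.kd, self.k_ifb = kp, ki, kd, k_ifb
        self.target = target
        self.integral = 0.0
        self.prev_error = 0.0
        self.insulin_on_board = 0.0   # running estimate of recently delivered insulin

    def dose(self, glucose, dt=5.0):
        error = glucose - self.target
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        u -= self.k_ifb * self.insulin_on_board   # IFB: restrain dosing by history
        u = max(u, 0.0)                           # insulin delivery cannot be negative
        self.insulin_on_board = 0.9 * self.insulin_on_board + u  # assumed decay
        return u

# Usage: repeated calls with CGM readings yield a dose suggestion per step.
controller = PidIfb()
for reading in (180.0, 175.0, 168.0):
    print(controller.dose(reading))
```

A deviation measure of the kind proposed above could then be added as an extra penalty on excursions outside the target range.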
A possible suggestion concerning FLC is developing a mobile application that connects the FLC
database in the AP with the clinician's computer and provides regular data about the insulin and
glucose level history. This app could allow the clinician to communicate with the patient and keep
an eye on the patient's condition wirelessly. The interesting part is that FLC is highly
customizable and operates like our own reasoning, using linguistic rules. The app could achieve a
special benefit by allowing the clinician to monitor the patient's status, modify the operation
rules of the FLC through the clinician's computer according to the patient's individual statistics,
and update them to match the patient's needs, as sketched below. Such a mobile app could increase
the efficiency of the algorithm and facilitate communication between the clinician and the patient
and between the clinician and the FLC, and it may attain pleasing and useful results.
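As a rough illustration of the linguistic rules mentioned above, the sketch below evaluates three hypothetical fuzzy rules mapping a glucose reading to a basal-rate adjustment. The membership breakpoints and rule outputs are invented, and they are exactly the kind of parameters a clinician could tune remotely through such an app.

```python
# A minimal sketch (invented breakpoints): fuzzy linguistic rules mapping a
# glucose reading to a basal insulin adjustment in [-1, +1].

def membership(glucose):
    """Piecewise-linear degrees of membership in three fuzzy sets."""
    low = max(0.0, min(1.0, (90 - glucose) / 20))
    normal = max(0.0, 1 - abs(glucose - 110) / 40)
    high = max(0.0, min(1.0, (glucose - 140) / 40))
    return low, normal, high

def basal_adjustment(glucose):
    low, normal, high = membership(glucose)
    # Rules: IF low THEN decrease (-1); IF normal THEN hold (0); IF high THEN increase (+1)
    weights = low + normal + high
    return (low * -1.0 + normal * 0.0 + high * 1.0) / weights if weights else 0.0

print(basal_adjustment(70), basal_adjustment(110), basal_adjustment(180))
# -> -1.0 (decrease), 0.0 (hold), 1.0 (increase)
```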
VIII. Conclusion
For years, scientists have studied the natural pancreas and have been able to mathematically simulate its function using various methodologies and complexity levels. The mathematical model created by Banzi and coauthors is one of the dependable and not overly complicated models. Scientists' primary objective in developing the algorithms of the closed-loop system was to create an artificial pancreas that performs similarly to a natural one. Because of the high potential benefits of the closed-loop system, we decided to review it; moreover, it has been the subject of several studies and tests. It consists of an insulin pump, which may host the control algorithm, a continuous glucose monitoring system, and a communication device that allows doctors and experts to monitor the device and the patient's condition. However, the development of algorithms, and the addition of complexity to increase CLS performance in real settings, was and continues to be a key component of CLS and AP technologies. According to the volunteers who used the CLS, they completely forgot that they had diabetes, and the results exceeded expectations. On the other hand, the CLS still has a few problems and gaps, so we reviewed these problems and proposed solutions to increase the efficiency of the project. The AP aims to change patients' lives mentally, psychologically, and socially, helping about half a million people get their work done without suffering.
IX. References
Abstract Gene therapy is a potential treatment for many incurable, lethal, and chronic diseases, such as psychiatric disorders. Competing with other kinds of medication, gene therapy, also known as gene alteration, has been seen as a prospective therapeutic solution, as genetics contributes intensively to the origin of many disorders. The emergence of gene therapy was accompanied by controversial arguments about its unknown side effects and effectiveness, which impeded its development. After many experiments on mice and deliberate research in the world of genomics, the first genetic therapy on the human body was conducted in 1990 on a young girl who was diagnosed with a rare genetic disorder. The treatment went successfully, and it has spurred the implementation of gene therapy for numerous health issues. Among the many diseases that can be treated using gene therapy, psychiatric disorders are the most prominent, as they are profoundly affected by gene defects. Depression, bipolar disorder, Alzheimer's, and OCD are examples of mental issues caused by defects in certain genes.
I. Introduction
For therapeutic purposes, genetics can be an effective tool. Gene therapy is the modification or
manipulation of the expression of certain genes inside the cell to change its biological behavior
and cure a specified disorder. But what makes biologists think of using genetics instead of familiar
medications such as pharmaceuticals?
The reason is that genetics has a profound, direct contribution to most diseases: genes can
mutate during the growth of the body, and genes can also be missing from the moment of birth. Such
genetic problems can give rise to chronic health issues.
Gene therapy presents a promising attempt to treat different diseases such as leukemia, heart
disease, and diabetes. Gene therapy could also be used to improve the body's immunity while it
fights immune-destructive disorders like HIV.
II. Gene Therapy Overview
Gene therapy emerged in the late 1960s and early 1970s, when the science of genetics was revolutionized. In 1972, two genomics scientists, Theodore Friedmann and Richard Roblin, issued a paper named "Gene therapy for human genetic disease?"; their paper pointed out that genetic treatment is a potential cure for patients with incurable genetic disorders, achieved by merging a DNA sequence into a patient's cells. The paper encountered much disapproval, as the side effects of gene therapy were unknown at that time. However, after deliberate research and experiments, the first gene therapy trial on the human body went successfully in 1990. The therapy was conducted on a young girl who was diagnosed with a deficiency of an enzyme called ADA, making her immune system so vulnerable that even a weak infection could have killed her. Fortunately, that trial paved the way for gene therapy to flourish as a treatment among other types of medication.
III. Gene therapy vs genetic engineering
A renowned misconception is that gene therapy and genetic engineering are synonymous;
nonetheless, they are different technologies. Gene therapy is a technique that aims to alter the DNA
sequence inside malfunctioning cells to cure genetic defects. Genetic engineering, on the other
hand, is used to modify the characteristics of a certain gene to enhance its biological functions
beyond the normal. Genetically Modified Organisms are an obvious example of genetic engineering
products. For illustration, the advancement of biotechnological techniques has enabled scientists to
develop modified crops with certain abilities that cope with human needs, such as plants with less
need for fertilizers and more prolific yields.
IV. Gene therapy stages
You might have imagined gene therapy as injecting the patient with a syringe carrying a gene that simply substitutes for the flawed gene inside the cell. That picture is broadly right; however, the process is not that simple.
V. Types of genetic therapy
There are two types of genetic therapy: somatic therapy and germline therapy. Somatic therapy
inserts the new gene into a somatic cell (a cell that does not produce sperm or eggs). Somatic
therapy does not ensure that the disease will not appear in successive generations, and it requires
the patient to take it several times, as its effect does not last long. On the other hand, germline
therapy targets the reproductive cells, which produce the gametes that later develop into an embryo.
Germline therapy occurs once in a lifetime: it is performed either on a pre-embryo to treat genetic
defects or on a flawed adult sperm or egg before it enters the fertilization process.
VI. Genetic therapy and psychiatric disorders
The productivity of individuals within their society is determined by their mental condition;
if they suffer from a mental issue, they will not function well in their daily routines. Psychiatric
disorders are psychological and behavioral defects that cause disturbances in the functions,
feelings, and perceptions of the brain. Ranging from sleep troubles to Alzheimer's, psychiatric
disorders take many different forms, such as depression, schizophrenia, bipolar disorders, and
developmental disorders like ADHD.
VII. What makes gene therapy the expected future approach for most of the psychiatric disorders?
Despite the continuous research and advances in medical treatment methods for various
psychiatric conditions, a large number of patients remain unresponsive to current approaches.
Developments in human functional neuroimaging have helped scientists identify specific targets
within dysfunctional brain networks that may cause various psychiatric disorders. Consequently, deep
brain stimulation trials for refractory depression have shown promise. As the procedures and targets
advanced, scientists were able to use similar techniques to deliver biological agents such as gene
therapy.
Identification of specific molecular and anatomic targets is important for the development of
gene therapy. In gene therapy, the vehicles used to transfer genes to the neurons in the brain are
modified viruses, called viral vectors. Viruses have the ability to transfer their genetic material
to target cells, and viral vectors take advantage of this ability. The viral coat of the vectors can
deliver a payload with the therapeutic gene efficiently, while the proteins or viral genes that
might cause replication, toxicity, or inflammation are reduced.
VIII. Gene therapy and p11 protein in treating Depression /Bipolar
IX. Gene therapy pros and cons
Gene therapy research has been ongoing for decades, and researchers say that it could be used to
treat various diseases. However, they had to dive deeper into it to discover its pros and cons.
Gene therapy is sometimes better than other treatments because it has many advantages. For
instance, its effects are long-lasting, as the defective gene is replaced by a healthy one; that
healthy gene is then the one transferred to the offspring. Furthermore, germline gene therapy can be
used to replace the genes of incurable diseases in gametes, which could result in eradicating
genetic diseases such as Parkinson's disease, Huntington's disease, and Alzheimer's disease.
Conversely, gene therapy has several cons. For instance, it can go wrong because it is still
under research, and the immune system response can lead to inflammation or organ failure. In 1999, a
clinical trial was conducted at the University of Pennsylvania on an 18-year-old man, who died as a
result. In the clinical trial, the Ad5 vector was used to deliver the gene for ornithine
transcarbamylase, a deficient hepatic enzyme. An investigation by the university showed that the man
died due to a massive immune reaction.
X. Enduring gene therapy is tough because it has many ethical obstacles
XI. Conclusion
As genetic diseases are increasing rapidly and may result in chronic health issues, gene therapy would be one of the most promising treatments. In addition to its significant success in curing diseases such as leukemia, heart disease, and diabetes, it was discovered that it could contribute to treating psychiatric disorders. Various psychiatric disorders were noticed to have a major genetic component. As a result, research has been ongoing to find out whether gene therapy is scientifically appropriate to treat psychiatric disorders or not. Several experiments have been conducted on mice to measure the therapy's efficiency, and fortunately, these experiments have shown promise. As the research has demonstrated, gene therapy has numerous merits that can benefit humankind. Nevertheless, it has many disadvantages, such as those resulting from the reaction of the immune system. Additionally, critics affirm that the therapy could go beyond correcting genetic defects, so ethical issues might emerge: genetic information may not be kept secure, and standards of special capabilities may be imposed, which could prevent many people from being active members of society. To conclude, gene therapy is a double-edged sword; however, further research on it will contribute to the eradication of many serious diseases.
XII. References
Abstract For centuries, humans have desired to learn about and explore outer space, leave this planet, and travel for many years. In recent decades, many steps have been taken in the right direction, and as a result, we have successfully conquered outer space. However, the beginning was a difficult step; as a new frontier is developed, new problems loom over the horizon. Many issues began to arise to oppose space exploration projects. It cannot be said that all the problems are associated with space itself, but rather with humans. Because they relate to humans and their health on other planets, medical issues remain the most significant and most severe of all. Just one of these issues is the cardiovascular system challenge. This research delves into the potential risks to the cardiovascular system and how they affect human life on other planets. Furthermore, some helpful suggestions for resolving the issue are provided.
I. Introduction
Africa was the origin of humanity. Nevertheless, we did not all remain there; our forefathers
travelled all across the continent for thousands of years before leaving. Why? Probably for the same
reason we glance up at Mars and wonder, "What is up there? Are we able to travel there?" Perhaps we
could.
II. Risks to the cardiovascular system during space flight assessment
III. Cardiac output to Mars
Gravity affects several components of the circulatory system, including the heart. On Earth, for example, the veins in our legs struggle against gravity to return blood to the heart. In the absence of gravity, however, the heart and blood vessels change, and the longer the flight, the more severe the changes.
IV. Experiment
The radiation and low gravity of space also have an impact on the body’s vascular system, causing circulatory problems for astronauts when they return to Earth and an increased risk of heart attack later in life:
Grenon grew the cells and placed them in an environment that approximated very low gravity. She
discovered that a lack of gravity causes a decrease in the expression of specific genes within the
cells that affect plaque adhesion to the vessel wall. While the effects of these changes are not
entirely evident, it is known that a lack of gravity has an impact on molecular features.
Furthermore, previous research by Grenon revealed that micro-gravity causes changes among the
cells that regulate energy flow within the heart, potentially putting astronauts at risk for cardiac
arrhythmia.
In 2016, Schrepfer looked into the vascular architecture of mice that had spent time on the
International Space Station, as well as vascular cells grown in micro-gravity on Earth. Her team is
still analyzing its findings, but it appears that the carotid artery walls thinned in the mice in
space, perhaps due to the lower blood pressure required for circulation under lower gravity.
The researchers also discovered that the endothelial cells showed changes in gene expression and
regulation similar to those seen in patients with cardiovascular disease on Earth. While those
changes are not harmful in the microgravity of the Space Station, they have a deleterious impact on
blood circulation on Earth. When astronauts return to Earth's gravity, muscle weakness is just one
of the reasons they cannot get up, according to Schrepfer. "They also don't get enough blood to
their brain since their vascular function is impaired," says the researcher.
There is reason to be optimistic: Schrepfer and her colleagues have discovered a small molecule
that prevents the weakening of vascular walls in mice. She and her team are planning to conduct
protective experiments in humans in the near future.
V. Pharmacological countermeasures
After such extensive experiments, both in space and on the ground, it was determined that
pharmacological therapy should target plasma or blood volume expansion, autonomic dysfunction, and
impaired vascular reactivity. This can assist in identifying the most appropriate countermeasures
for protecting orthostatic tolerance and physical work performance. Several practices have included
agents such as fludrocortisone or electrolyte-containing beverages that increase the amount of blood
circulating throughout the body. In more detail, beta-adrenergic blockers can be used to reduce the
degree of cardiac mechanoreceptor activation or to inhibit epinephrine's peripheral vasodilatory
effects. Furthermore, by inhibiting parasympathetic activity, disopyramide can be used to avoid
vasovagal responses. Finally, alpha-adrenergic agonists such as ephedrine, etilefrine, or midodrine
are used to increase venous tone and venous return while also increasing peripheral vascular
resistance through arteriolar constriction.
VI. Conclusion
Finally, it can be said that traveling into space is now easier than ever, but the challenge now is determining how to achieve the best results and how to keep track of one's health during space flights. As seen, medical issues arising in the cardiovascular system are critical because of its vital role in keeping astronauts alive during their missions. We have seen how differences in gravity can lead to fundamental problems in space, such as changes in the size and structure of the heart muscle, fluid shifts, and a drop in blood pressure. In addition to several possible outcomes, an experiment was presented to demonstrate how radiation and low gravity in space can affect the human body and the cardiovascular system. Finally, using gravity calculations between the Earth and Mars, the scientific idea behind the medical problems was demonstrated, revealing the mysteries underlying those phenomena.
VII. References
Abstract As modern computer technology has developed to interpret human brain signals, it has become possible to use a computer system as an output channel for brain signals. This developed technology is the brain-computer interface (BCI). A brain-computer interface, sometimes called a brain-machine interface or a direct neural interface, is a hardware and software communications system that gives disabled people a direct communication pathway between the brain and an external device, rather than the normal output through muscles. Through experimentation, three types of BCI have been developed: invasive, semi-invasive, and non-invasive BCIs, which differ in where they are placed relative to the brain. BCI is used in different applications, such as gaming applications that provide disabled people with entertainment driven by their brain signals, as well as in the medical and bioengineering fields. In addition, it is used as neurofeedback therapy, contributing to the treatment of psychiatric conditions such as ADHD and anxiety.
I. Introduction
Attention deficit hyperactivity disorder (ADHD) is a neurobiological disorder characterized by
symptoms of inattention, overactivity, and impulsivity.
II. An overview of BCI
III. Loop and components
The following are the specifics of the BCI elements:
Control paradigm: At this stage, the user transfers data to the system. In traditional interfaces this is done by pressing a button with the appropriate function or by moving the mouse; BCI, however, necessitates the development of a "control paradigm" for which the user can be held accountable. For example, the user may imagine moving a part of the body or focus on a specific object to generate brain signals that encode the user's intent. Some BCI systems may not require intentional user effort; instead, the system detects the user's mental or emotional states automatically. In terms of interaction, these approaches are classified as active, reactive, or passive.
Measurement: Brain signals can be measured in two ways: invasively and non-invasively. Invasive methods, such as the electrocorticogram (ECoG), single microelectrodes (ME), or micro-electrode arrays (MEA), detect signals on or inside the brain, ensuring relatively good signal quality. However, these procedures necessitate surgery and carry numerous risks; thus, invasive methods are clearly not appropriate for healthy people. As a result, significant BCI research has been conducted using non-invasive methods such as EEG, magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), and Near-Infrared Spectroscopy (NIRS), among others. EEG is the most widely used of these techniques: it is inexpensive and portable compared to other measuring devices, and wireless EEG devices are now available on the market at reasonable prices. As a result, EEG is the most preferred and promising measurement method for use in BCI games.
Processing: The measured brain signals are processed to maximize the signal-to-noise ratio and select target features. Various algorithms, such as spectral and spatial filtering, are used in this step to reduce artifacts and extract informative features. The chosen target features are used as inputs for classification or regression modules.
Prediction: This step makes a decision about the user's intention or quantifies the user's emotional and mental states. Classifiers such as thresholds, linear discriminant analysis, support vector machines, and artificial neural networks are commonly used for prediction.
Application: After determining the user's intent in the prediction step, the output is used to change the application environment, such as games, rehabilitation, or treatment regimens for attention deficit hyperactivity disorder. Finally, the user is given the predicted change in the application as feedback.
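To make the processing and prediction steps concrete, here is a minimal sketch of an EEG band-power pipeline feeding a linear discriminant classifier. It assumes NumPy, SciPy, and scikit-learn are available and runs on simulated stand-in data; the sampling rate, epoch shape, and frequency band are assumptions, not parameters of any specific BCI system.

```python
# A minimal sketch (assumed shapes and band): band-power features from EEG
# epochs classified with linear discriminant analysis.

import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def band_power(epochs, fs=250, band=(8, 12)):
    """epochs: (n_trials, n_channels, n_samples) -> mean alpha-band power per channel."""
    freqs, psd = welch(epochs, fs=fs, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

# Simulated stand-in data: 40 trials, 8 channels, 2 s at 250 Hz, binary labels.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 8, 500))
y = rng.integers(0, 2, 40)

features = band_power(X_raw).reshape(40, -1)          # Processing step
clf = LinearDiscriminantAnalysis().fit(features, y)   # Prediction step
print(clf.predict(features[:5]))                      # intent estimates for the Application step
```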
IV. BCI Applications
V. Psychiatric disorders
Mental illnesses, or psychiatric disorders, are a wide range of health conditions or neurological disorders that can affect your mood, thinking, and even your behavior. Having mental health concerns from time to time does not mean having a mental disorder; a mental health concern is said to be a mental disorder when ongoing symptoms affect the person's ability to function. A variety of environmental and genetic factors are the main causes of psychiatric disorders, including inherited traits and environmental exposures.
VI. ADHD treatment using BCI based games
To treat ADHD using BCI-based games, an experiment was conducted in 2015 to study a BCI system
that mainly uses steady-state potentials; its main target was to improve the attention levels of
people suffering from ADHD.
VII. BCI game therapy
Neurofeedback, or electroencephalography (EEG) biofeedback by its medical term, is a kind of
biofeedback: a process of learning to change physiological activity for the purposes of improving
performance and health. It measures brain waves and body temperature in a non-invasive way, and it
changes and normalizes the speed of specific brain waves in specific brain areas to treat different
psychiatric disturbances like ADHD and anxiety.
So, it teaches self-control of brain functions by measuring the brain waves and then giving
feedback represented in audio or video. The produced feedback is positive or negative according to
whether the brain activity is desirable or undesirable, respectively.
Neurofeedback is an adjuvant therapy for psychiatric conditions such as attention deficit
hyperactivity disorder (ADHD), generalized anxiety disorder (GAD), and phobic disorder.
The treatment starts by mapping out the brain through quantitative EEG to identify which areas of
the brain are out of alignment. Then EEG sensors are placed on the targeted areas of the head, where
brain waves are recorded, amplified, and sent to a computer system that processes the produced
signals and gives the proper feedback. The brain's current state is then compared with what it
should be doing.
VIII. Conclusion
The Neurofeedback (NFB) therapy technique provides the user with real-time feedback on brainwave activity, measured by sensors, in the form of a video display and sound. The Brain-Computer Interface (BCI) framework consists of five stages that form a closed loop. BCI helps patients with neurological disabilities to communicate again through prosthetic devices. BCI also provides disabled people, including those with amputated limbs, with the entertainment they need, as they often feel bored. Gaming BCIs allow these people to be entertained through games that need no physical effort, only brain signals. Psychiatric disorders are mental health issues that can affect your mood and actions. A variety of things can cause these disorders; inherited traits and environmental exposures can both be reasons. Psychotherapy is one way to treat mental health issues. In most cases, psychotherapy is completed successfully in a few months, but in many cases a life-long treatment is needed. Brain-stimulation treatment can be used to treat depression, but as this is the age of technology, BCI is now also capable of addressing these disorders. Depression, substance abuse, anxiety, and mood instability are all disorders in which NFB has shown effectiveness. There is evidence that NFB decreases seizures, and some evidence also supports the effectiveness of NFB for ADHD.
IX. References
Abstract Consciousness is a subjective (implicit) experience. Artificial consciousness aims to simulate this consciousness by building a model as complex as a human brain; any model less complex than the brain will not be able to simulate the human brain or any part of it. Building a subject is one of the biggest difficulties, because scientists still do not know what, specifically, a subject is; consequently, it is impossible to build something you cannot define. Many attempts have been made to build machines able to do tasks with the same proficiency as humans. Some succeeded, such as Deep Blue, which beat the chess world champion Garry Kasparov, but this has not reached human consciousness yet; it just follows specific, complex commands. This category of machines lacks emotions, love, creativity, desire, and curiosity. Now, scientists try to model the brain in RAM, in which every neural connection (synapse) is represented by a floating-point number that requires 4 bytes of memory in a computer. The brain contains about 10^15 synapses, which corresponds to 4 million GB of RAM. This much memory is not available in a computer so far; it is predicted that it will be available around 2029. This idea may fail for any reason, but many researchers, scientists, and technologists believe that artificial consciousness will become a reality someday, even in the far future.
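The memory estimate above can be checked with a quick worked calculation (taking 1 GB as 10^9 bytes):

```latex
% Worked check of the synapse-memory estimate quoted in the abstract
\[
10^{15}\ \text{synapses} \times 4\ \tfrac{\text{bytes}}{\text{synapse}}
  = 4\times10^{15}\ \text{bytes}
  = \frac{4\times10^{15}}{10^{9}}\ \text{GB}
  = 4\times10^{6}\ \text{GB},
\]
% i.e., the 4 million GB of RAM stated above.
```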
I. Introduction
II. Consciousness, biological process, or psychological concept:
Consciousness is a subjective experience: what "it is like" to perceive a scene, to endure pain,
to entertain a thought, or to reflect on the experience itself. When consciousness fades, as it does
in dreamless sleep, the entire world vanishes from the intrinsic perspective of the experiencing
subject. Consciousness depends mainly on the integrity of certain brain regions, and the particular
content of an experience depends on the activity of neurons in parts of the cerebral cortex (see
Fig. 1).
III. AC Origin
IV. AC technical development and difficulties
V. AC future
VI. Conclusion
Consciousness refers to your personal perception of your unique thoughts, memories, feelings, and environment. Essentially, your consciousness is your awareness of yourself and the world around you, and this awareness is subjective and unique. From a neurological perspective, science is still exploring the neural basis of consciousness. But even if we had a complete neuroscientific picture of how the brain works, many philosophers believe there would still be a problem, which they call the "Consciousness Problem." The brain is the most complex organ in the universe as we know it. It has about 100 billion neurons and more neural connections than there are stars in our galaxy. This is why we are incredible beings with a spark of consciousness. A popularly discussed approach to achieving generally intelligent models that could be conscious depends wholly on reaching a "brain simulation": a low-level brain model is created by scanning and mapping the biological brain in detail and copying its state to a computer system or other computing device. Eventually, it is possible to state a condition that must hold to consider a machine self-aware or conscious: a neural network must be at least as complex as the human brain, because less complex brains are not able to produce conscious thoughts; indeed, such a network will not produce any conscious thoughts at all. Scientists and technologists now work on building a model as complex as the human brain. It is just a prediction, but they believe that AC will reach human consciousness by 2029; and even if this attempt fails, AC may reach human consciousness eventually, even in the far future.
VII. References
Abstract Eight decades ago, the first artificial blood purifier was invented. Today, in a world where spending twelve hours a week in treatment is not a viable option for many patients, the same bulky machines are still in use. Fortunately, scientists have been vigilant, and the notion of developing a portable, reliable dialysis machine has been pursued by many. In this paper, we first analyze the principle of dialysis. Then we shed light on the technological innovations achieved since the first dialysis machine was mass-produced. The use of a high-flux membrane dialyzer, ultrapure dialysis fluid, and convection fluid has proved to greatly improve dialysis. However, difficulties still prevail: until an efficient substitute can be found for the dialysate and proper healthcare can be given to patients at home, a portable dialysis machine is not going to be devised.
I. Introduction
The kidney is arguably one of the most important organs in the human body, because it cleans the
blood of toxic wastes; when it loses its renal function, the person can die. Hence, scientists have
long hoped to invent a machine that simulates the function of the kidneys to clean human blood. The
first successful machine for human dialysis was invented and operated in the 1940s by a Dutch
physician called Willem Kolff. Kolff came up with the idea of developing a blood purifier when he
saw a patient with kidney failure, and he became interested in the possibility of artificially
simulating kidney function to remove toxins from the blood of patients with uremia, or kidney
failure. Although only one person was successfully treated at first, Kolff continued his experiments
to develop the design.
II. The scientific premise behind Dialysis
The principles of dialysis can be traced back to Thomas Graham's discovery of diffusion. In his
first article on gaseous diffusion, Graham proposed that gaseous flow was related to gas density. He
examined the escape of hydrogen via a tiny hole in platinum and observed that hydrogen molecules
were moving out four times more quickly than oxygen molecules. His tests were designed in such a way
that he could quantify the relative speeds of specific molecular movements. He also observed that
heat enhanced the speed of these molecular movements while increasing the force with which a given
weight of gas resisted atmospheric pressure. Graham's numerical calculations revealed that the
velocity of flow was inversely proportional to the square root of the density. His law demonstrated
that the specific gravity of gases could be assessed more precisely than usual. He also noted,
remarkably, that a diffusive gas escapes faster from a compound. This paved the way for the
invention of the dialysis machine. In dialysis, blood flows along one side of a semi-permeable
membrane, and a dialysate, or special dialysis fluid, flows along the opposite side. A semipermeable
membrane is a thin layer of material that has holes, or pores, of different sizes. Smaller solutes
and fluid pass through the membrane, but the membrane blocks the passage of larger substances. This
replicates the filtering process that occurs in the kidneys: when blood enters the kidneys, the
larger substances are separated from the smaller ones in the glomerulus.
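Graham's square-root law can be checked against his hydrogen-oxygen observation using the molar masses of the two gases (2 and 32 g/mol):

```latex
% Graham's law applied to the hydrogen/oxygen observation above
\[
\frac{v_{\mathrm{H_2}}}{v_{\mathrm{O_2}}}
  = \sqrt{\frac{\rho_{\mathrm{O_2}}}{\rho_{\mathrm{H_2}}}}
  = \sqrt{\frac{32}{2}} = 4,
\]
% matching the fourfold faster escape of hydrogen that Graham measured.
```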
In general, there are pretreatment systems before dialysis devices, which deliver water of a high
quality according to appropriate requirements (primarily reverse osmosis [RO]). Scientists believe
that malfunctioning pretreatment systems, and the resulting poor feed-water quality of the dialysis
instrument, might be related to some tragic occurrences at dialysis centers. Even minimal trace
element concentrations, such as heavy metals, in dialysis water can severely disrupt the trace
element balance of dialysis patients. Elements such as aluminum, nickel, cadmium, lead, and chromium
must thus be taken into account in particular. A rise in nickel level, for example, may lead to
acute nickel poisoning. Aluminum also causes a disrupted calcium-phosphate balance in dialysis
patients and, over long periods of time, brain and bone conditions. Based on the above, reducing
heavy metals in water is highly essential.
III. Technological Innovations in Hemodialysis
i. Online Monitoring Technologies
Dialysis automation and profiling have made the process safer for the patient and the care team,
reducing un-physiologic incidents and human mistakes. Online monitoring relies on immediate
information about parameters: blood volume (BV), dialysate conductivity, urea kinetics, and thermal
energy balance. The dialysis machine uses these measurements to apply automated actions that meet
the body's standards, such as sodium and potassium modeling and temperature control, which affect
the patient during and after dialysis.
ii. Effects of Automated sodium modeling
According to the received measurements, the machines decide to keep the current concentration or
change it; one of these measurements is the dialysate sodium concentration. The machine tends to
raise the dialysate sodium concentration to prevent intradialytic hypertension causing after
dialysis vicious harms; Increased thirst, Intradialytic Weight Gain, and Hypertension; keeping in
mind that fluid retention of ≥ 4 kg between two subsequent dialysis sessions is associated with a
higher risk of cardiovascular death.
iii. Effects of Automated potassium modeling
An analysis conducted as part of the 4D study showed that a large portion of the high mortality
consists of sudden death or abnormal cardiac rhythm: the sudden shifts in plasma potassium caused by
hemodialysis sessions can cause death in arrhythmia-prone patients. A lower dialysate potassium
concentration is used to remove the excess potassium, providing the necessary gradient.
iv. Effects of Automated Temperature Modelling
Temperature modeling has been implemented by modifying the dialysate temperature via blood
temperature monitoring integrated into HD machines. The machine adjusts the dialysate temperature
between 34 and 35.5 °C according to the patient's blood temperature of 37 °C, which results in
better cardiovascular stability during the HD treatment than the normal dialysate temperature.
IV. Purity of Dialysate and Dialysis Water
The water and concentrates used to produce dialysate, as well as the dialysate itself, are
required to meet quality standards to reduce the risk of injury to HD patients from the chemical and
microbiological contaminants that can be present in the dialysate.
Intact Bacteria vs. Bacterial Products
In the dialysate there are harmless contaminants, like intact bacteria that cannot pass through the dialyzer membrane, and harmful bacterial products, such as endotoxins, endotoxin fragments, peptidoglycans, and pieces of bacterial DNA, which can cross into the bloodstream and cause chronic inflammation by stimulating mononuclear cells. The induced inflammatory state may be an essential contributor to the long-term sickness associated with HD.
Preparation of Ultra-Pure Water
Studies have shown that tiny fragments of bacterial DNA can maintain chronic inflammation in HD patients by prolonging the survival of inflammatory mononuclear cells.
V. Hemofiltration & Hemodiafiltration
Hemodialysis is based on diffusion: the exchange of solutes from one fluid to another through a semipermeable membrane along a concentration gradient. Even high-flux HD membranes do not make a difference in the number of removed solutes, because solute diffusivity decreases rapidly with increasing molecular size. Convection therapies such as hemofiltration (HF) and hemodiafiltration (HDF), in contrast, can remove larger solutes. Convection requires large volumes of substitution fluid, which is supplied by online ultrafiltration of dialysate together with sophisticated volume-control systems to maintain fluid balance.
Hemodiafiltration
HDF using a high-flux membrane dialyzer, ultrapure dialysis fluid, and convection fluid is highly efficient. According to study results, high-efficiency online HDF is associated with a 35% reduced risk of mortality. Regular use of online HDF is also associated with reduced morbidity compared with standard HD.
Hemofiltration
A comparative study of high-flux HF versus ultrapure low-flux HD showed a significantly higher
survival rate with HF than with standard HD (78% vs. 57% at 3 years of follow-up). The study also
demonstrated inclusion and logistic problems associated with online monitored hemofiltration.
VI. Difficulties facing the development of portable dialysis machines
Many obstacles have hindered the development of a smaller dialysis machine, let alone a full-fledged wearable artificial kidney. The primary impediment has been the lack of an effective strategy to enable toxin removal without using substantial volumes of dialysate, a limitation that applies to both hemodialysis and peritoneal dialysis.
VII. Conclusion
It cannot be denied that impressive technological innovations in the field of dialysis have been introduced in the past few decades, from the invention of the first machine until now. The translation of these technical achievements into hard clinical outcomes is more difficult to demonstrate, but some innovations have genuinely made dialysis better. Despite that, it is unlikely that these innovations will be widely adopted in the next few years, as there are not enough studies ensuring the long-term safety of patients. Furthermore, the need for an available caregiver will remain a must if an artificial kidney were to be introduced.
VIII. References
Abstract Nanodrug delivery systems (NDDSs) are drug delivery systems made of nanoscale materials that encapsulate active compounds intended to treat certain conditions. They can be made of many different materials; however, hydrogels, polymeric nanoparticles, and carbon nanotubes have become some of the more prominent NDDS materials in recent years. Each NDDS has properties specific to its material. This literature review seeks to establish which NDDS has the best abilities in terms of general properties that can be observed in all DDSs, with a focus on hydrogels, polymeric nanoparticles, and carbon nanotubes. After analyzing the data and properties of each of the three materials, we found that each one surpasses the others in one property that makes it unique. Thus, determining which of these specific NDDSs is the most effective in general is difficult, and they should be chosen based on what they would be utilized for in a specific circumstance.
Keywords: nano-drug delivery systems, hydrogels, polymeric nanoparticles, carbon nanotubes, biomedicine, toxicity, bioavailability, retention, biodistribution, biocompatibility, solubility, sustained release, administration routes, mechanical strength
I. Introduction
Nano drug delivery systems have recently become increasingly studied for their potential. Their
main advantage is improved bioavailability and specific drug delivery, which makes them better
suited than traditional drugs for the treatment of some conditions. Because of this, drug delivery
systems have become more sophisticated, focusing on a more controlled and targeted release. This
helps avoid systemic release of the therapeutic substance. Bioavailability refers to the fraction of
the drug in question that enters the circulation and is therefore able to be absorbed and have an
effect. The bioavailability of nanoparticles is generally improved due to increased solubility or to
mechanisms that allow their passage through cell membranes.
II. Methodology
This paper is designed to critically evaluate the overall efficacy of three smart nanomaterials when used in nanodrug delivery systems. Scientific information was carefully selected by the authors from numerous reliable sources. To facilitate this, Zotero was used to bookmark every source and its respective citation, which was then added to an extensive bibliography. All citations and the bibliography follow the IEEE citation style due to its widespread usage in scientific research papers. By analytically assessing each of the three materials before rigorously comparing them all together, the authors were able to describe each material fully in its own context before formulating further comparisons. The criteria for finding sources were restricted so that only primary sources are used; this ensures that the scientific information is unaltered in phrasing or style of writing, as only the original sources are located and used.
III. Materials
IV. Analysis
Hydrogels, polymeric nanoparticles, and carbon nanotubes all have characteristics that may be utilized to determine which nano-drug delivery method is optimal. Sustained release, administration methods, mechanical strength, customizability, retention, toxicity, biocompatibility, drug-carrying capabilities, and production cost are all factors to consider.
V. Conclusion
We have analyzed the properties of the three nanodrug delivery systems' materials and evaluated them to find the best material in terms of sustained release, administration routes, mechanical strength, customizability, retention, toxicity, biocompatibility, drug-carrying capacity, and production cost. In terms of biocompatibility and customizability, the three nano-drug delivery technologies addressed in this study surpass conventional medicines. Hydrogels have been shown to have the best sustained-release properties and the most versatility in terms of administration routes, which is a limitation of polymeric nanoparticles, for example. When compared to hydrogels and carbon nanotubes, polymeric nanoparticles exhibit higher retention and lower toxicity. Carbon nanotubes, on the other hand, have high mechanical strength (whose absence is a major drawback of hydrogels), are lightweight, and have an excellent drug-carrying capacity. All of the nanodrug delivery methods mentioned above have extremely high production costs. As a result, determining which of these specific NDDSs is the most suitable in general is difficult, and the choice should be based on what they would be utilized for in a specific circumstance. As noticed from the analysis, every NDDS material is unique for a specific use; it ultimately depends on the circumstance it will be used for and in. Therefore, determining which one is the "ultimate" material is almost an impossible task.
VI. References
Abstract Neurons are known to stay in G0 once they are mature, so neurons are not expected to form tumors. But what if the cell starts to lose control during its differentiation process? The type of tumor depends on the type and location of the transformed cell. One of these tumors is neuroblastoma, which affects 700 to 800 people in the United States annually. In this review, we will take a look at neuroblastoma. From definition to treatment, passing through causes, symptoms, clinical stages, and its different types, we will discuss the various features that affect and are affected by the disease.
I. Introduction
Neuroblastoma, which was first described in the 1800s, is the most common cancer in infants and the third-most common cancer in children after leukemia and brain cancer. Around one in every 7,000 children is affected at some time; 90% of cases occur in children less than 5 years old, and it is rare in adults. Moreover, 15% of cancer deaths in children are due to neuroblastoma. Neuroblastoma (NB) is a type of cancer that originates in certain types of nerve tissue. It most often starts in the adrenal glands and can then develop in the neck, chest, abdomen, or spine. Symptoms include bone pain; a lump in the abdomen, neck, or chest; or a painless bluish lump under the skin. Neuroblastoma arises from a genetic mutation occurring during early development or from a mutation inherited from a person's parents; environmental factors have not been found to be involved. Diagnosis is based on a tissue biopsy, in which a small sample of the tissue is taken and examined under a microscope. Sometimes it may be found in a baby by ultrasound during pregnancy. By the time of diagnosis, the cancer has usually already spread. The cancer is classified into low-, intermediate-, and high-risk groups based on the child's age, the cancer stage, and the cancer's appearance. Treatment depends on the severity of the disease and may include observation, surgery, radiation, chemotherapy, or stem cell transplantation. Low-risk disease in babies has good outcomes with surgery or simple observation. In high-risk disease, however, chances of long-term survival are less than 40%, despite aggressive treatment.
II. The Disease & Its Types
III. The Symptoms
40% of Neuroblastoma patients who present with clinical symptoms are under 1 year of age, less
than 5% with clinical symptoms are over the age of 10 years, and the rest are between 1 and 10 years
old.
IV. Diagnosis
The current criteria for diagnosis and staging of neuroblastoma are based upon the INSS
(International Neuroblastoma Staging System), which was initially developed in 1986. The diagnosis
of NB can be made either by characteristic histopathological evaluation of tumor tissue or by the
presence of tumor cells in a bone marrow biopsy together with elevated levels of urinary
catecholamines (dopamine, vanillylmandelic acid, and homovanillic acid).
V. The Causes
Neuroblastoma occurs when immature nerve cells (neuroblasts) grow continuously without any
control. The cells become abnormal and continue dividing, forming a tumor. A genetic mutation (a
change in the neuroblasts' genes) causes the cells to grow and divide uncontrollably, and scientists
are still not sure what causes this genetic mutation.
VI. The Treatment
VII. Conclusion
Although humans have reached a high level of progress in neuroscience and other biological sciences, they still cannot solve some of their mysteries, such as neuroblastoma. We do know the disease, its symptoms, its stages, how to diagnose it, and even the age group most susceptible to it. But we do not know for sure what causes the gene mutation that leads to this disease or how to prevent it. Even the treatment methods used are just the traditional ones used for other similar cancers. That means only one thing: the progress humans have achieved in all fields of science is not enough, and there are always more things to reach and discover. So never give up on learning more and applying what you have learned, so that you can discover new things and reach a status no one has reached before.
VIII. References
Abstract Astrobiology is the study of life outside Earth. The quest for life beyond Earth requires an understanding of life and of the nature of the environments that support it, as well as of planetary and stellar processes. One of the first steps in searching for life outside Earth is to find, by various methods, an exoplanet that presents a supportive environment for life. When such a planet is detected, the scientific challenge is to determine whether it can accommodate life across the great distances of interstellar space. Scientists have evaluated and constrained the conditions of habitability at each of these stages through research on Mars' terrestrial analogues, its surface geochemistry, and the likelihood of organic and inorganic biosignature preservation. Studying these analog environments provides crucial information for a better understanding of past and current mission results, as well as for the design, selection, and planning of future Mars exploration missions.
Keywords: Habitable zones, Mars, Life, Earth, Microorganisms, Habitability, Detection, Biosignatures
I. Introduction
When we hear the term astrobiology, the first thing that comes to mind is space and extraterrestrial life; however, astrobiology is a scientific field that not only discusses extraterrestrial life but also focuses on the environmental factors that enabled life on Earth. The Universe contains a vast number of planets; nevertheless, not all of them are habitable. Scientists refer to the region around a star in which a planet can maintain the factors necessary for life (like liquid water and suitable temperature) as the habitable zone. Habitable zones differ from one star to another due to several factors, such as the stars' sizes. Mars is unavoidable when discussing astrobiology and extraterrestrial life: being among the closest planets to Earth, it was the first place scientists looked for life. However, Mars is not a perfect planet, and scientists have faced many obstacles there, chiefly the absence of oxygen (due to its thin atmosphere), the absence of liquid water, and the wide temperature range. The search for life has focused on our solar system; however, many planets lying beyond our solar system could be habitable. These planets are called exoplanets. Detecting exoplanets has been a challenge to scientists because of their distance from Earth. Several methods for detecting exoplanets have been developed, the most important of which are the radial velocity and transit methods, and they increasingly rely on AI. But how can we predict life on a distant planet without visiting it? To address this problem, scientists use scientific models.
i. Factors that allowed life to emerge on Earth:
Many factors allowed life to emerge on Earth, among them water, the atmosphere, the ozone layer, Earth's location in the universe, the magnetosphere, the Moon, the distance from the Sun, and temperature. We will explain two of these factors in detail.
1) Water:
Water is the second most abundant molecule in space. Water ice is widely distributed in space and can be observed by many telescopes. Distant galaxies contain water, proving that water was already present in the early universe. Our solar system is rich in water in different places and forms, such as at the poles of telluric planets (e.g., Earth and Mars) or in small celestial bodies like comets (which contain a significant fraction of water). In addition, liquid-water oceans may be present under the ice crusts of several moons of the outer planets (e.g., of Jupiter and Saturn).
II. Habitability of Mars
A star radiates approximately as a blackbody, so its emitted power is
$$P = \sigma A T^4,$$
where $P$ is the power emitted from the star, $\sigma$ is the Stefan–Boltzmann constant, $A$ is the radiating surface area, and $T$ is the surface temperature [6]. If the power the planet absorbs from the star is assumed to be equal to the power the planet itself emits, this assumption gives:
$$\sigma\, 4\pi R_s^2 T_s^4 \cdot \frac{\pi R_p^2}{4\pi d^2} = \sigma\, 4\pi R_p^2 T_p^4,$$
where $\pi R_p^2 / 4\pi d^2$ is the fraction of the star's power, radiated over a spherical area $4\pi d^2$ at the planet's orbital distance $d$, that is absorbed by the cross-section area of the planet ($\pi R_p^2$); $R_s$ and $T_s$ are the star's radius and temperature, and $R_p$ and $T_p$ are the planet's.
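Solving this balance for the planet's temperature gives $T_p = T_s\sqrt{R_s/(2d)}$. The short Python sketch below checks this with Sun/Earth values; it is a toy under the stated assumptions (zero albedo, blackbody emission, no greenhouse effect).

```python
import math

def equilibrium_temp(t_star: float, r_star: float, d: float) -> float:
    """Planet equilibrium temperature from the energy balance above.

    Toy assumptions: the planet absorbs everything it intercepts (zero
    albedo) and re-emits as a blackbody; sigma cancels from both sides.
    """
    return t_star * math.sqrt(r_star / (2.0 * d))

# Sun/Earth check: T_s = 5778 K, R_s = 6.957e8 m, d = 1 AU = 1.496e11 m.
print(equilibrium_temp(5778.0, 6.957e8, 1.496e11))  # ~279 K (~255 K once albedo is included)
```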
III. From Earth to Mars
One source of life on Mars could be Earth. It is feasible that solar winds striking the upper atmosphere have ejected and propelled microbe-laden dust and particles from the stratosphere and mesosphere into space, and that microbes residing in rock have been ricocheted into space from Earth by meteor strikes, repeatedly contaminating Mars and other planets, and vice versa. Moreover, tropical storms, monsoons, and even seasonal upwellings of columns of air may transport microbes, spores, and fungi (along with water, methane, and other gases) to the stratosphere and mesosphere, where they may persist. It has been proposed that solar winds and photons can then disperse these airborne organisms throughout the cosmos. Consequently, it may reasonably be assumed that microbes not only flourish in the troposphere but that, when lofted into the stratosphere and mesosphere, many remain viable and can then be blown into space by powerful solar winds, where they can survive. Although innumerable meteorites break apart upon striking Earth's upper atmosphere, those at least ten kilometers across will punch a hole in the atmosphere and continue their descent. When meteors of this size or larger strike the ground, tons of dust and rocks (possibly carrying microbes, fungi, algae, and lichens), along with other debris, can be propelled more than 100 km above the planet and ejected into space.
i. Life on Mars
Spacecraft that landed or crashed on Mars could also have transferred life from Earth to Mars. For example, even after sterilization, typically between 300 and 540 different colonies, alongside thousands of organisms, including fungi, vegetative microorganisms, Bacillus, gram-positive cocci, and microorganisms of the genera Streptococcus, Corynebacterium, and Brevibacterium, survived on the Viking landers and other spacecraft bound for Mars. Counting non-cultivated species, and given the abundance of germs and fungi found growing even inside the equipment, the true number of survivors is unknown. Bacilli remain reproductive and can tolerate long-term exposure to deep radiation in Mars-like environments. Many species of Micrococcus have also escaped elimination by surviving sterilization, together with several strains of Staphylococcus and Corynebacterium that tolerate conditions similar to those of Martian habitats and could not be removed from the spacecraft. Streptococcus may be yet another genus that has resisted NASA's sterilization efforts; some strains remained active after 30 months. As a result, microorganisms could have been present in all shipments to Mars. Earth's own atmosphere can also be stripped upward: as detected by NASA's Ultraviolet Imager aboard the Polar spacecraft, a magnetospheric explosion was caused by coronal mass ejection (CME) sequences and strong solar rays, giving the polar regions enough pressure to push oxygen, helium, hydrogen, and other gases from Earth's upper atmosphere into space. Typically, this pressure is around two to three nanopascals; when the CME hit, it jumped to 10 nanopascals. It is therefore plausible that nutrients, mold, moss, and algae carried aloft in this way have arrived at Mars and replicated there.
ii. Microorganisms on Mars
Many researchers have found that a variety of species, including microorganisms, algae, mold, and mildew, can survive in artificial environments simulating Mars. These survival rates increase when the organisms are supplied with water or protected by rock, sand, or gravel. Bacillus subtilis was found to survive Mars-like UV irradiation, and it has been suggested that cyanobacteria collected from cold and warm regions survived "conditions like Mars, including atmospheric composition, gravity, changing humidity (wet and dry conditions), and strong UV rays." Six subspecies of the genus Carnobacterium collected from permafrost in northeastern Siberia, considered an analog of the Martian subsurface, along with nine additional Carnobacterium species, were all reported capable of thriving and growing under Mars-like conditions. In another case, four species of methanogens (Methanosarcina barkeri, Methanococcus maripaludis, Methanothermobacter wolfeii, Methanobacterium formicicum) survived exposure to low-pressure conditions. Cyanobacteria are also tolerant of Mars conditions: akinetes (dormant cells formed by filamentous cyanobacteria) exposed to outdoor conditions, including desiccation periods, extreme temperatures (-80 to 80 °C), and UV rays (325-400 nm), showed high viability in these Mars-like settings. Eukaryotes (fungi, moss) are also survivors. The microcolonial fungi Knufia perforans and Cryomyces antarcticus, and Exophiala jeanselmei (a type of black yeast), reportedly survived prolonged exposure to Mars-like thermo-physical conditions and showed no evidence of subsequent stress. After dried colonies of the Antarctic cryptoendolithic black fungus Cryomyces antarcticus were exposed for 16 months to simulated Mars-like scenarios on the International Space Station, it was determined that C. antarcticus was able to tolerate the combined stress of extraterrestrial substrates, space, and Mars-like conditions, maintaining survival, DNA integrity, and structural stability. Surface radiation on Mars is estimated at about 0.67 millisieverts per day. This is far below the radiation tolerance levels of various prokaryotes and simple eukaryotes, including radiation-resistant fungi that tolerate doses up to 1.7 × 10⁴ Gy. Some fungi, mosses, and microbes are even attracted to and thrive in highly radioactive environments: radiation-tolerant molds and bacteria will seek out and grow toward radiation sources, using them as a source of metabolic energy, or, when radiation beyond their tolerance levels damages their DNA, they can readily repair it thanks to genes with repair functions.
iii. Lichens
iv. BIOMEX experiments
To carry out the BIOMEX experiments, many preflight checks were completed to determine whether the chosen samples were able to withstand severe conditions approaching those of space and Martian environments. After a series of experiments on Mars-like regolith and desiccation tests, the organisms were subjected to Experiment Verification Tests (EVTs) and Science Verification Tests (SVTs). In the EVTs, vacuum, a low-pressure Mars-like CO2 atmosphere, extreme temperature cycles from well below zero to more than 40 °C, and UVC irradiation were applied. The SVT experiments were conducted in hardware under conditions that approached those of the space environment on the ISS. The results showed that the lichen C. gyrosa exhibits high resistance and survival capacity: neither combinations of Martian atmosphere and surface UV climate nor LEO vacuum conditions induced a significant decrease in the activity of the lichen after exposure for 120 h. It is important to emphasize that in this experiment the lichen thalli were exposed to simulated Mars and real space conditions. Many of the chosen archaea and bacteria, and the heterogeneous multilayered biofilms formed by many species, were observed to be the most resistant to simulated or direct space and Mars-like conditions. Less resistance, and a considerably lower cell number and vitality in the Mars-like environment, was shown for multicellular life-forms such as the tested fungus Cryomyces antarcticus and the lichens Buellia frigida and Circinaria gyrosa, although some studies show that the test lichen survived the 30-day incubation in the Mars chamber, particularly under niche conditions.
IV. Machine learning methods
About 4,600 million years ago, our solar system was formed; we know this from the study of meteorites and radioactivity. But have you ever contemplated the questions "Are we alone?" or "Is there life beyond the Earth?" This is the topic of exoplanets, planets beyond our solar system. These planets come in a wide variety of sizes and orbits: some are gas giants, some are rocky, some are icy, and some hug close to their stars. But how do scientists detect exoplanets? It is not a simple task. We may have imagined life on other planets in books and films for centuries, but detecting actual exoplanets is a recent achievement, since planets emit very little or no light of their own. Subsection i details methods for detecting exoplanets, while a later subsection analyzes the advantages and drawbacks of each of these methods.
i. Exoplanets detection methods
Artificial intelligence (AI) and advanced vision technologies are being used to strengthen instrument capabilities and expand possibilities for detecting exoplanets.
ii. The Radial Velocity Method
iii. Transit method
iv. Planetary detection: less prolific methods
Other methods for detecting exoplanets, in addition to radial velocity and transit photometry, include direct imaging, timing, and gravitational microlensing. In direct imaging, scientists use infrared imaging to examine the thermal radiation of exoplanets. Typically, only large, hot planets either very close to or very far from their stars can be observed this way, and imaging Earth-like planets necessitates high levels of optothermal stability. Timing methods take advantage of niche properties of certain exoplanets: they can involve pulsars (neutron stars) that emit radio waves on a regular basis, or changes in the minima of eclipsing binaries, which can reveal planets far from their host star.
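For a concrete feel for the transit method introduced above (subsection iii), here is a toy Python sketch that injects periodic 1% dips into a synthetic light curve and flags them by simple thresholding; real pipelines use far more robust searches (e.g., box least squares), and every number here is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic light curve: constant flux of 1.0 with 0.1% Gaussian noise...
t = np.linspace(0.0, 10.0, 2000)            # time in days
flux = 1.0 + 0.001 * rng.standard_normal(t.size)

# ...plus periodic 1% transit dips: period 2.5 d, duration 0.1 d.
period, duration, depth = 2.5, 0.1, 0.01
in_transit = (t % period) < duration
flux[in_transit] -= depth

# Detection: flag points far below the out-of-transit scatter
# (the noise level is known here; in practice it must be estimated).
threshold = np.median(flux) - 5 * 0.001
candidates = flux < threshold
print(f"flagged {candidates.sum()} of {in_transit.sum()} true in-transit samples")
```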
v. Analysis and Comparison of Methods
V. Exoplanet Hunting: Using Machine Learning to Predict Planetary Atmospheric Chemical Reaction Rates
A Machine Learning Approach to Forecasting Weather on Mars
After all that, does this mean that Mars is uninhabitable? A destructive cosmic event could at any given moment completely erase all life on Earth; extinction may be inevitable, as may mounting pressure on Earth's biosphere. Going beyond our own domain, far beyond the Sun and Earth, Mars could be a means of reducing the risk of human extinction. Despite this high goal, hostile Martian weather conditions vary considerably from those on Earth, and predicting these conditions would be invaluable for successful colonization. In particular, the extremely wide temperature range (20 °C to -73 °C) is a significant barrier to implementing human infrastructure. Traditional weather prediction techniques used on Earth, such as numerical weather prediction (NWP), are extremely computationally intensive and, due to the volatile physical conditions of the atmosphere, are not always stable; beyond that, NWP cannot be easily applied to forecast Martian weather. To overcome this barrier, supervised machine learning, a method resistant to the incongruities of atmospheric conditions that lead to NWP's uncertainties, is well suited to the Martian atmosphere, which is even less understood. Weather data from Mars' Gale Crater has been collected by NASA's Curiosity rover and made available via the Planetary Data System. To predict the mean temperature from Curiosity's data, two types of machine learning algorithms will be implemented: linear regression and artificial neural networks. These paradigms have been selected because each can account for the mixture of nonlinear and linear weather responses. About 3 Mars years of weather data will be used to train the two models to predict about 1 year of test data; the mean and median absolute errors of the mean-temperature predictions are then calculated, and the models are compared.
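As a hedged illustration of this comparison, the sketch below trains the two model families with scikit-learn on synthetic stand-in data; the real inputs would be Curiosity's Gale Crater records from the Planetary Data System, and all features, model sizes, and values here are invented assumptions, not the study's actual configuration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, median_absolute_error
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic stand-in for ~4 Mars years of daily records (669 sols per
# Mars year): features are the seasonal phase and a pressure proxy;
# the target is the sol's mean temperature in deg C. All invented.
sols = np.arange(4 * 669)
phase = 2.0 * np.pi * (sols % 669) / 669.0
pressure = rng.normal(850.0, 30.0, sols.size)
X = np.column_stack([np.sin(phase), np.cos(phase), pressure])
y = -50.0 + 25.0 * np.sin(phase) + 0.01 * pressure + rng.normal(0.0, 2.0, sols.size)

# Train on ~3 Mars years, hold out ~1 year as the test set.
split = 3 * 669
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

models = {
    "linear regression": LinearRegression(),
    "neural network": make_pipeline(StandardScaler(),
                                    MLPRegressor(hidden_layer_sizes=(32, 32),
                                                 max_iter=2000, random_state=0)),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: MAE={mean_absolute_error(y_te, pred):.2f} "
          f"MedAE={median_absolute_error(y_te, pred):.2f}")
```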
VI. Conclusion
By identifying past and potentially present habitable environments in our solar system, and by detecting planets around other stars, we will come to understand the origin of life and its evolution and diversification on Earth much more closely. Transit photometry, combined with measurement of the host star's radial velocity, can yield a planet's density, which can be indicative of its formation history. In the coming years, the discovery of new transiting planets through ground- and space-based projects will increase understanding of these objects. Scientists are beginning to conduct comparative planetology, and new insights will soon be gained into the genuine mass distribution of these objects. There is also scope to improve chemical reaction rate prediction. It is humankind's nature to explore its surroundings whenever possible. Spacecraft were first sent to the planets forty years ago, and since then the art of space exploration has become increasingly refined, and discoveries have multiplied. Theoretically, scientists can now reach and explore any object in the solar system, and Mars, the most hospitable and most intriguing of the planets, is at the top of the list of exploration targets.
VII. References
Abstract In this article, we will examine a novel cancer treatment approach that involves the use of viral therapy. The main idea of the article is the use of HIV (which under normal circumstances targets T-cells) as a vector to transfer genetic information to a patient. HIV is the most common vector for the transfer of genetic information into a malignant cell, and it is also the most dangerous. These genetic components will assist the patient's T-cells in recognizing and damaging the malignant cells in his or her body. It may seem far-fetched to think of using a virus to treat cancer, yet it is a proven approach. When it comes to cancer treatment, HIV has proved its potential to deliver precise genetic information to the nucleus. This article will discuss the treatment's mechanism, the rationale for using HIV specifically as a vector, and CAR-T therapy, among other topics. Leukemia is the case study of this article. Aside from that, the article provides the necessary foundational knowledge regarding HIV's behavior and binding mechanism, as well as viral vectors and CAR-T therapy.
I. Introduction
Various types of leukemia therapy have been available for many years. These therapies differ in how they interact with or target cancerous cells. The kind of leukemia, the patient's age and health state, and whether or not the leukemia cells have spread to the cerebrospinal fluid all influence treatment. Chemotherapy, for example, is the most common method of treatment for leukemia: chemicals are used to destroy leukemia cells, and besides being given orally as pills, chemotherapy can be administered intravenously through a catheter or intravenous line. Another treatment is biological therapy, which assists the immune system in identifying and attacking aberrant cells; antibodies, tumor vaccines, and cytokines are examples of biological treatments for cancer. Instead of killing all quickly dividing cells, however, targeted treatments are often preferred: they interfere with a specific feature or function of a cancer cell, so targeted treatment causes less harm to normal cells than chemotherapy. Moreover, radiation treatment is another option for curing leukemia. It targets cancer cells with high-energy radiation, and in addition to treating leukemia that has progressed to the brain, radiation therapy can be used to treat leukemia that has collected in the spleen or other organs. There are other treatment approaches for leukemia, but these are the most important ones; patients might get either one treatment or several, depending on the kind and severity of the leukemia. To deal with leukemia, our research focuses on identifying leukemia and its types. Leukemia, a cancer of the blood cells, affects the myeloid and lymphoid cell lines in the bone marrow. Unlike other sorts of cancer that form tumors with harmful effects in various parts of the body, leukemia cells travel freely in the bloodstream; their harm lies in their dysfunctionality. A huge number of useless blood cells and platelets multiply constantly in the bone marrow, clogging it up and crowding out normal blood cells. The four main types of leukemia depend on which blood cell line they affect (myeloid or lymphoid) and on when in the blood cell's life cycle the disease strikes (acute or chronic). Acute Myeloblastic Leukemia occurs in myeloblasts, marrow stem cells that give rise to RBCs, WBCs, and platelets. Acute Lymphoblastic Leukemia occurs in lymphoblasts, which give rise to T-cells, B-cells, and natural killer cells. The affected blasts lose their ability to mature and remain useless, oversized cells. Because of their size and number, they easily clog tissues upon infiltrating them, causing hepatomegaly, splenomegaly, bone pain, and other problems, while the decrease in the number of normal cells leads to anemia, frequent infections, hemorrhages, and so on. Our approach is to use an HIV-based vector to treat the cells. It might seem like madness to use a virus to treat cancer, and it is, in a sense, scientific madness; however, our findings indicate that HIV can indeed be used as the basis of a leukemia treatment.
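The four main types just described form a simple two-way classification (cell line × onset), which the short Python sketch below encodes for illustration; the two chronic entries (CML, CLL) complete the standard 2×2 beyond the two acute types named above.

```python
# Illustrative 2x2 encoding of the four main leukemia types:
# lineage (myeloid vs lymphoid) x onset (acute vs chronic).
LEUKEMIA_TYPES = {
    ("myeloid", "acute"): "Acute Myeloblastic Leukemia (AML)",
    ("lymphoid", "acute"): "Acute Lymphoblastic Leukemia (ALL)",
    ("myeloid", "chronic"): "Chronic Myeloid Leukemia (CML)",
    ("lymphoid", "chronic"): "Chronic Lymphocytic Leukemia (CLL)",
}

def classify(lineage: str, onset: str) -> str:
    """Look up the leukemia type for a lineage/onset combination."""
    try:
        return LEUKEMIA_TYPES[(lineage, onset)]
    except KeyError:
        raise ValueError(f"unknown combination: {lineage}/{onset}") from None

print(classify("lymphoid", "acute"))  # -> Acute Lymphoblastic Leukemia (ALL)
```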
II. HIV
HIV-1 and HIV-2 are members of the retrovirus family, under the genus Lentivirus. HIV (human immunodeficiency virus) is a virus that attacks the body's immune system. It is made up of two strands of RNA, 15 different types of viral proteins, and a few proteins from the last host cell it infected, all encased in a lipid bilayer membrane. Together, these components allow the virus to infect a kind of white blood cell in the body's immune system known as a T-helper cell (also called a CD4 cell). These important cells keep us healthy by defending us against infections and illnesses. From the earliest steps of viral attachment through the final process of budding, each molecule in the virus plays a role in this process. HIV is incapable of reproducing on its own. Instead, the virus binds to and merges with a T-helper cell; it then takes over the cell's machinery, integrates into the cell's DNA, replicates itself within the cell, and eventually releases more HIV into the bloodstream. HIV is important in our study since it is the basis of the lentiviral vector that attaches to T-cells and delivers the antileukemia transgene.
HIV structure
Mechanism of HIV binding
III. CAR-T Therapy & leukemia
Chimeric antigen receptors are specifically engineered to fit a certain antigen such that they enable the T-cells to latch onto the targeted antigen. Consequently, for a given disease, a CAR's target antigen should be highly expressed on most of the cells or viruses that cause the disease, while having little correspondence to normal tissues, sparing them from severe damage.
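This selection criterion (high expression on diseased cells, low expression on normal tissue) can be illustrated with a toy scoring function in Python; the candidate antigen names and expression values below are invented for demonstration and carry no biological meaning.

```python
# Toy illustration of CAR target selection: prefer antigens highly
# expressed on malignant cells but scarce on normal tissue.
# All names and numbers below are hypothetical.
candidates = {
    "antigen_A": {"tumor": 0.95, "normal": 0.05},
    "antigen_B": {"tumor": 0.90, "normal": 0.60},  # too much normal-tissue expression
    "antigen_C": {"tumor": 0.40, "normal": 0.02},  # too weakly expressed on tumor
}

def target_score(expr: dict) -> float:
    """Higher is better: reward tumor expression, penalize off-tumor toxicity."""
    return expr["tumor"] - 2.0 * expr["normal"]

best = max(candidates, key=lambda name: target_score(candidates[name]))
print(best)  # -> antigen_A
```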
CAR T-cell therapy for HIV infections
According to UNAIDS, acquired immunodeficiency syndrome (AIDS), the medical condition caused by HIV, has caused about 36 million deaths since it was first identified in the early 1980s, with a death count of 680,000 in 2020, a reduction of 64% from the peak in 2004 and of 47% since 2010. 37.7 million people are currently living with HIV.
Antiretroviral therapy is the standard treatment for HIV at the moment. If properly applied, the plasma HIV viral load can be significantly lowered, or even become undetectable. Despite this success, new information suggests that cryptic viral replication still occurs, that immunological dysfunction is common during treatment, and that HIV can resurface after years of undetectable viremia. Furthermore, antiretroviral therapy's side effects are thought to increase the chance of non-AIDS mortality. As a result, the efficacy of current therapeutic techniques is far from sufficient, which has led to research into CAR T-cell therapy for HIV infections.
Immunotherapeutic methods to treat HIV infection have been hindered by features peculiar to HIV, such as the high mutation rate of reverse transcriptase, which allows the rapid generation of immune-escape mutant variants and the recurrence of viremia.
To destroy HIV-infected cells, first-generation anti-HIV CAR designs used the CD4 receptor's extracellular region as the targeting domain, along with the CD3ζ T-cell signaling domain. Later research indicated that CD4-based CARs make gene-modified T cells vulnerable to HIV infection [9]. Several ways to improve HIV-specific CAR T-cells have been tested to address this constraint, including the construction of tandem CAR T-cells, or CAR T-cells expressing a CD4 CAR in combination with either a gp41-derived fusion inhibitor or CCR5 ablation. Anti-HIV CARs have also been reengineered with 4-1BB or CD28 stimulatory signaling domains to improve their persistence and potency in the body when used in combination with soluble broadly neutralizing antibodies (bNAbs) that recognize nonredundant gp120/gp41 antigen epitopes.
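To keep the construct variants described above straight, here is a small, purely illustrative Python record of a CAR design; the field names and example domain lists mirror the text but are not an actual vector specification.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CARConstruct:
    """Illustrative record of a CAR design; fields mirror the text, not a real spec."""
    targeting_domain: str                    # e.g., the CD4 extracellular region
    signaling_domain: str                    # e.g., CD3-zeta
    costimulatory_domains: List[str] = field(default_factory=list)  # e.g., 4-1BB, CD28
    protective_features: List[str] = field(default_factory=list)    # e.g., CCR5 ablation

# First-generation anti-HIV CAR as described in the text.
first_generation = CARConstruct("CD4 extracellular region", "CD3-zeta")

# A reengineered variant combining the improvements named above.
reengineered = CARConstruct(
    "CD4 extracellular region",
    "CD3-zeta",
    costimulatory_domains=["4-1BB", "CD28"],
    protective_features=["gp41-derived fusion inhibitor", "CCR5 ablation"],
)
print(reengineered)
```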
IV. Viral vectors
Types of viral vectors and their mechanism
HIV as a viral vector
V. The mechanism of the treatment
In this section, we explain why HIV is used in this process. In addition, we elaborate on the process of using HIV as a leukemia treatment.
i. Why HIV
First, HIV naturally targets T-cells. HIV directly infects T cells via a high-affinity interaction between the virion envelope glycoprotein (gp120) and the CD4 molecule. The infection of T cells is assisted by the T-cell coreceptor CXCR4, while HIV infects monocytes by interacting with the CCR5 coreceptor. The HIV genome is then transferred to the nucleus of the T-cell. Therefore, HIV is an efficient vehicle for transferring genetic material to the T-cell. Second, HIV is well suited for use as a vector, because lentivirus-based vectors can infect nondividing and slowly dividing cells.
ii. Genetic Materials
First, a sample of immune cells is taken from the patient. Then, the gene for a special receptor that binds to a specific protein on the patient's cancer cells is added to the patient's T-cells. This receptor is called a chimeric antigen receptor (CAR), and the resulting CAR T-cells are not normally found in the immune system. Proteins found on the surface of foreign bodies help the immune system recognize those bodies, and each such protein has a particular receptor that can bind to it; these proteins are called antigens. In other words, the relationship between antigens and immune receptors is like a lock and key: just as a lock can only be opened with the right key, each foreign antigen has a unique immune receptor that binds to it. Since different cancers have different antigens, each CAR is made for a specific cancer's antigen. For instance, in certain kinds of leukemia, the cancer cells have an antigen called CD19. The CAR T-cell therapies made to treat these cancers attach to the CD19 antigen and will not work for a cancer that does not have the CD19 antigen.
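The lock-and-key relationship just described amounts to a membership test: a CAR engages a cell only if its target antigen is present. Here is a minimal Python sketch, with CD19 taken from the text and the other antigen names invented as placeholders.

```python
# Lock-and-key sketch: a CAR T-cell engages a cancer cell only if the
# cell carries the CAR's target antigen. "CD19" comes from the text;
# the other antigen names below are hypothetical placeholders.
def car_recognizes(car_target: str, cell_antigens: set) -> bool:
    return car_target in cell_antigens

leukemia_cell = {"CD19", "antigen_X"}     # certain leukemias express CD19
cd19_negative_cancer = {"antigen_Y"}      # hypothetical CD19-negative cancer

print(car_recognizes("CD19", leukemia_cell))         # True: the CAR can engage
print(car_recognizes("CD19", cd19_negative_cancer))  # False: anti-CD19 CAR won't work
```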
iii. Making the CAR T cells
After the white cells are removed, the T cells are separated and altered by adding the gene for the specific chimeric antigen receptor (CAR). This makes them CAR T-cells. These cells are then grown and multiplied in the lab; it can take several weeks to make the large number of CAR T-cells needed for this therapy.
iv. Preparing HIV and Infection
In this treatment, HIV is, in simplest terms, a vector. By removing all the viral RNA inside, HIV is rendered harmless. HIV is then exposed to the CAR T-cells, which are injected back into the patient. The CAR T-cells begin to target both the cancerous and the non-cancerous B-cells (the lymphocytes responsible for making antibodies in the body). Once the CAR T-cells bind with B cells, they start to increase in number and can help destroy even more cancer cells. The T cells then multiply and create a memory in the body to kill any future cancerous B-cells, preventing this type of leukemic cancer from forming again.
VI. Conclusion
Viruses have positive aspects as well as negative ones. We can use viruses in the treatment of deadly diseases such as leukemia (a cancer of the white blood cells). Through new technologies such as viral therapy, we can use these viruses to introduce specific genetic material into the human body, with the virus playing the role of the transport vehicle. HIV normally targets T-cells (a type of white blood cell); therefore, HIV can be used to transfer the genetic material that T-cells need to fight the cancerous cells in the immune system. In our case study, we are working on leukemia, in which blood cells themselves become cancerous. To treat this type of cancer, we need a certain receptor that makes the T-cell able to recognize the cancer cell by binding to it; this receptor is called a CAR. First, the T-cells are taken from the patient. Then, the CAR gene is added to the T-cells in the laboratory. Finally, the modified T-cells are exposed to HIV, and the result is introduced to the patient. The modified T-cells become able to recognize the cancerous cells and fight them. In addition, the T-cells then multiply and create a memory in the body to kill any future cancerous B-cells, preventing this type of leukemic cancer from forming again. There are future fields of research and development that we recommend. To begin, since the precise sequence of the genetic material is not yet known, attempts must be made to ascertain it. Additionally, CAR-T treatment requires further development before it can be used on human cells. Moreover, because the method of HIV insertion remains somewhat vague, experiments on mice must be conducted prior to human use. We also recommend that other viral vectors, like adenoviruses and herpes simplex virus, be used in other therapies. Overall, viral vectors appear to be a key to curing many diseases, and they will be used in several therapeutics in the coming years.
VII. References