Patriot's Blog Archives for 2021-06

U.S. VETERANS SERVE AT HOME BY COMBATING FOOD DESERTS

The shuttering of three area Walmart stores forced residents in a 44-square-mile swath of southwest Wichita, Kansas, to live in a food desert. However, through the partnership and support of the CDFI Fund, Enterprise Community Loan Fund and veteran-owned business Honor Capital, low-income families again have access to healthy food options and locally driven economic opportunity.

The U.S. Department of Agriculture (USDA) generally considers a food desert to be an area of low income (at least 20 percent of residents living in poverty) and low access to supermarkets or grocery stores (more than a mile away in urban areas and more than 10 miles away in rural areas). Due to a lack of healthy food options, those living in food deserts are disproportionately afflicted by nutrition-related illnesses, including diabetes and high blood pressure.
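The two criteria above can be sketched as a simple classification rule. This is an illustrative sketch only, not an official USDA tool; the function name and parameters are made up for this example.

```python
def is_food_desert(poverty_rate, miles_to_grocery, urban=True):
    """Classify an area using the two USDA criteria described above:
    low income (at least 20% of residents in poverty) and low access
    (more than 1 mile to a grocery store in urban areas, more than
    10 miles in rural areas)."""
    low_income = poverty_rate >= 0.20
    access_limit = 1.0 if urban else 10.0
    low_access = miles_to_grocery > access_limit
    return low_income and low_access

# An urban area with 25% poverty and a 2-mile trip to the nearest store
# meets both criteria:
print(is_food_desert(0.25, 2.0, urban=True))  # True
```

Note that the same 2-mile distance would not count as low access in a rural area, where the threshold is 10 miles.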

Launched in 2014 by a group of U.S. Naval Academy graduates, Honor Capital’s mission is the eradication of food deserts in underserved rural areas and job creation for veterans. Building on experience opening three previous Save-A-Lot Stores, Honor Capital partnered with Enterprise to open its fourth store in this Wichita community. Tapping into CDFI Fund Healthy Food Financing Initiative (HFFI) funding, Enterprise provided Honor Capital with a loan to fund construction of the store. 

Opened on January 4, 2017, this store has expanded access to healthy food options for nearly 1,500 low-income families and seniors and created 12 new permanent jobs in a community with an unemployment rate of over five percent. 

“We are investing in Honor Capital because we share the same vision and see their great potential. Our investment will provide the capacity they need to continue to expand healthy food access in underserved communities and to create jobs,” says Enterprise Community Loan Fund President Lori Chatman. “We were able to partner with Honor Capital thanks to our HFFI award, and we expect the social return of our investment to grow as this veteran-owned business continues to defeat food deserts.”

 

That was the topic for today. I hope you enjoyed the blog and learned a few new facts. Thank you for reading.

The TOP 10 Medical Advances in History


 

Throughout history, disease has been a subject of fear and fascination in equal measure. However, each revolutionary medical discovery has brought us a crucial step closer to understanding the complex mysteries of disease and medicine. As a result, we have been able to develop medicines and treatments that have been instrumental in saving millions of lives. 

Here’s a chronological list of the top medical advances in history so far:

Vaccines (1796)

It is difficult to pinpoint when vaccines became an accepted practice, mostly because the journey to discovery was long and complicated. Beginning with Edward Jenner's 1796 attempt to use inoculation to tame the infamous smallpox virus, the usefulness and popularity of vaccines grew very quickly. Throughout the 1800s and early 1900s, vaccinations were created to combat some of the world's deadliest diseases, including smallpox, rabies, tuberculosis, and cholera. Over the course of 200 years, one of the deadliest diseases known to man, smallpox, was wiped off the face of the earth. Since then, virtually all vaccines have worked using the same concept. That was until a new technology, called mRNA, came along and created game-changing possibilities for the future of healthcare. Its high effectiveness, capacity for rapid development and potential for low production costs were evident during the Covid-19 pandemic, when two separate mRNA vaccines were developed and approved for use in just a matter of months.

Anaesthesia (1846)

Before the first use of a general anaesthetic in the mid-19th century, surgery was undertaken only as a last resort, with some patients opting for death rather than enduring the excruciating ordeal. Although there were countless earlier experiments with anaesthesia dating as far back as 4000 BC, William T. G. Morton made history in 1846 when he successfully used ether as an anaesthetic during surgery. Soon after, a faster-acting substance called chloroform became widely used, but it was considered high-risk after several fatalities were reported. Since the 1800s, safer anaesthetics have been developed, allowing millions of life-saving, painless operations to take place.

Germ theory (1861)

Before germ theory came about, the widely held belief was that disease arose through ‘spontaneous generation’. In other words, physicians of the time thought that disease could appear out of thin air, rather than being airborne or transferred via skin-to-skin contact. In 1861, French microbiologist Louis Pasteur proved through a simple experiment that infectious disease is the result of an invasion of specific microscopic organisms, also known as pathogens, into living hosts. This new understanding marked a significant turning point in how diseases were treated, controlled and prevented, helping to forestall devastating epidemics that were responsible for thousands of deaths every year, such as the plague, dysentery and typhoid fever.

Medical imaging (1895)

The first medical imaging machines used X-rays. The X-ray, a form of electromagnetic radiation, was ‘accidentally’ discovered in 1895 by German physicist Wilhelm Conrad Röntgen while he was experimenting with electrical currents through glass cathode-ray tubes. The discovery transformed medicine overnight, and by the following year a Glasgow hospital had opened the world's very first radiology department.

Ultrasound, although originally discovered many years before, began being used for medical diagnosis in 1955. This medical imaging device uses high frequency sound waves to create a digital image, and was no less than ground-breaking in terms of detecting pre-natal conditions and other pelvic and abdominal abnormalities. In 1967, the computed tomography (CT) scanner was created, which uses X-ray detectors and computers to diagnose many different types of disease, and has become a fundamental diagnostic tool in modern medicine. 

The next major medical imaging technology arrived in 1973, when Paul Lauterbur produced the first magnetic resonance image (MRI). Nuclear magnetic resonance data is used to create detailed images of structures within the body, and MRI is a crucial tool in detecting life-threatening conditions including tumours, cysts, damage to the brain and spinal cord, and some heart and liver problems.

Antibiotics (1928)

Alexander Fleming’s penicillin, the world’s first antibiotic, completely revolutionised the war against deadly bacteria. Famously, the Scottish biologist accidentally discovered the antibacterial ‘mould’ in a petri dish in 1928. However, Fleming’s incredible findings were not properly recognised until the 1940s, when penicillin began being mass-produced by American drug companies for use in World War II. Two other scientists, Australian Howard Florey and Nazi-Germany refugee Ernst Chain, were responsible for the mass distribution of penicillin, and their development of the substance ended up saving millions of future lives. Unfortunately, over the years certain bacteria have become increasingly resistant to antibiotics, leading to a worldwide crisis that calls for the pharmaceutical industry to develop new antibacterial treatments as soon as possible.

Organ transplants (1954)

In December 1954, the first successful kidney transplant was carried out by Dr Joseph Murray and Dr David Hume in Boston, USA. Despite many previous attempts in history, this was the first instance where the recipient of an organ transplant survived the operation. The turning point came when various technical issues were overcome, such as vascular anastomosis (the connection between two blood vessels), placement of the kidney and immune response. In 1963, the first lung transplant was carried out, followed by a pancreas/kidney in 1966, and liver and heart in 1967. Aside from saving thousands of lives in the years following, transplant procedures have also become increasingly innovative and complex, with doctors successfully completing the first hand transplant in 1998 and full-face transplant in 2010! 

Antiviral drugs (1960s)

Terrible viruses such as smallpox, influenza and hepatitis have ravaged many human populations throughout history. Unlike the sweeping success of antibiotics in the late 1930s and 1940s, the development of antivirals did not really take off until the 1960s. This was mostly due to the structure of a virus: a core of genetic material surrounded by a protective protein coat, which hides and reproduces inside a person’s cells. Because the viral genetic material is so well protected, it was difficult to treat infections without damaging the host cell. Over the years antivirals have improved significantly; they work by blocking the rapid reproduction of viral infections, and some can even stimulate the immune system to attack the virus. The development of effective antivirals has been significant in treating and controlling the spread of deadly virus outbreaks such as HIV/AIDS, Ebola and rabies.

Stem cell therapy (1970s)

The incredible potential of stem cells was discovered in the late 1970s, when they were found in human cord blood. Two specific characteristics make stem cells remarkable: they are unspecialised cells that can renew themselves through cell division even after long periods of inactivity, and under certain conditions they can be used to make any type of human cell. This discovery has enormous potential, and stem cell therapy has already been used to treat leukaemia and other blood disorders, as well as in bone marrow transplantation. Research is currently ongoing into using stem cells to treat spinal cord injuries and a number of neurological conditions such as Alzheimer’s, Parkinson’s and strokes. However, due to the ethical issues surrounding the use of embryonic stem cells, researchers are likely to face many obstacles when developing stem cell-based therapies.

Immunotherapy (1970s)

Immunotherapy, a treatment that stimulates the immune system to fight off a disease, has been in the making for over a century. The story began in the 1890s with the experimental work of William B. Coley, who injected inactive bacteria into cancerous tumours, achieving remission in some patients. However, it is only in the last 40 years that serious progress has been made in immunotherapy, particularly with respect to treating cancer. In the 1970s, antibody therapies were developed, and in 1991 researchers produced the first cancer vaccine, which was approved by the FDA in 2010. In the last decade, immuno-oncology has become one of the most revolutionary cancer therapies in existence.

Artificial intelligence (21st century)

Having been in gradual development since the turn of the century, artificial intelligence has already produced impressive technologies that have significantly altered the healthcare landscape. Life science companies and research institutions are teaming up with pioneering technology giants such as Google, IBM and Apple to invent smarter and faster ways to deal with diseases. These innovative technologies range from diagnostic tools that can detect malignant tumours invisible to the naked eye, to cognitive computing systems that produce tailored treatment plans for cancer patients. The potential of artificial intelligence in detecting, diagnosing and treating disease is rapidly unfolding before us and looks set to transform the future. 

 

That was the topic for today. I hope you enjoyed the blog and learned a few new facts. Thank you for reading.

DESTROYING The Welfare State

Thomas Sowell DESTROYS The Welfare State In One Sentence

 

Black Americans still face a struggle in American society that some people do not recognize. While the United States is the land of opportunity for all regardless of race, many black Americans find getting out of poverty a nearly impossible task. It is an incredibly sad situation, but one which has clearly identifiable roots.

While the Left will claim that there is a massive system of racist repression throughout our society that black Americans need to overcome, the real issue is not something as simple as the legacies of slavery or Jim Crow. While those institutions are blights upon our history, those factors do not directly affect the everyday lives of black Americans in the same way that another institution does currently.

That institution, as discussed by economist Walter Williams, is the welfare state, and specifically its legacy of destruction on the family unit.

It’s no secret that crime rates are higher among black Americans; it is also no secret that the vast majority of births in the black American population happen outside of wedlock, 75% to be exact. But why is this the case?

The welfare state has systematically preyed upon the family unit of black Americans since its inception. The way the system is structured financially incentivizes single parent households, and organizations like Planned Parenthood have also done their part to rip apart the fabric of the family as well.

The numbers are simply staggering, as Williams notes:

“The No. 1 problem among blacks is the effects stemming from a very weak family structure. Children from fatherless homes are likelier to drop out of high school, die by suicide, have behavioral disorders, join gangs, commit crimes and end up in prison. They are also likelier to live in poverty-stricken households. But is the weak black family a legacy of slavery? In 1960, just 22 percent of black children were raised in single-parent families. Fifty years later, more than 70 percent of black children were raised in single-parent families. Here’s my question: Was the increase in single-parent black families after 1960 a legacy of slavery, or might it be a legacy of the welfare state ushered in by the War on Poverty?”

The underhanded attack on the family over the past several decades has produced tangible results. While it is difficult to quantify the importance of the family, and exactly how its existence affects a child’s life, one can see the effects of its absence clearly just by looking at families that stick together versus those that separate.

“The bottom line is that the black family was stronger the first 100 years after slavery than during what will be the second 100 years.”

Just let that thought sink in for a second. The black family unit was more intact during the years of systemic racism and oppression than it is now, after slavery and Jim Crow have largely been erased from our nation’s politics.

“At one time, almost all black families were poor, regardless of whether one or both parents were present. Today roughly 30 percent of blacks are poor. However, two-parent black families are rarely poor. Only 8 percent of black married-couple families live in poverty. Among black families in which both the husband and wife work full time, the poverty rate is under 5 percent. Poverty in black families headed by single women is 37 percent. The undeniable truth is that neither slavery nor Jim Crow nor the harshest racism has decimated the black family the way the welfare state has.”

The welfare state’s assault on the family unit has wrought devastation on black America. Children need to have both parents in their youth. Sometimes single parents can sufficiently raise a child, but on a systemic level, as a general rule, both parents need to be together, taking an active role in raising up their children.

The results of failing to do so are increased crime rates, decreased economic activity, and further breakdown of whatever semblance of the family that remains after several generations of assault.

“Then there’s education. Many black 12th-graders deal with scientific problems at the level of whites in the sixth grade. They write and do math about as well as white seventh- and eighth-graders. All of this means that an employer hiring or a college admitting the typical black high school graduate is in effect hiring or admitting an eighth-grader. Thus, one should not be surprised by the outcomes.”

Williams concludes by saying that the idea that slavery and segregation have created these problems only perpetuates what we currently see. The politicians who support government policies like the War on Poverty and labor laws backed by unions only hold down black Americans on the lowest rung of the socio-economic ladder; right where they can prey on them for votes, handing out false promises that will never be fulfilled.

It’s truly saddening to see an entire population of Americans being repressed in such an underhanded way, and the means of repression are not in the form of slavery or Jim Crow laws, but in the form of an attack on the last thing upon which black Americans could rely: the family.

 

That was the topic for today. I hope you enjoyed the blog and learned a few new facts. Thank you for reading.

A Growing Cancer On Congress

A Growing Cancer On Congress: The Curse Of Party-Line Voting

Just as White House Counsel John Dean famously proclaimed the Watergate cover-up of the 1970s a “cancer on the presidency,” there is now a growing cancer on Congress.  The rapid and pervasive rise of party-line voting is a cancer that is eating at the effectiveness of both the House of Representatives and the Senate.  As a consequence, what was once the world’s most deliberative body, the US Senate, hardly deliberates at all, and what little is accomplished in Washington is done through party-line votes and executive orders, with devastating consequences.

The recent tax reform bill is Exhibit A, with zero Democrats voting for it in either the House or the Senate.  One Republican in the Senate and 13 in the House broke ranks to vote against it, largely out of a concern over its predicted increase in the federal debt.  With only one party at the table working on the bill, its provisions were developed last minute, with handwritten edits presented on the floor.  Deliberation, if it happened at all, was limited to one side of the aisle and a very narrow range of choices were considered in a short time frame.

Unfortunately, party-line voting has become the new normal.  As recently as the early 1970s, party unity voting was around 60%, but today it is closer to 90% in both the House and Senate.  If you think about the major legislative accomplishments of recent presidents, beginning with George W. Bush, you can see the problem.  Having campaigned for the presidency by touting his work across the aisle as governor of Texas, Bush found bipartisanship more difficult in Washington.  In his first year as president, Congress passed his No Child Left Behind education bill with strong bipartisan support, 384-45 in the House and 91-8 in the Senate.  But his next major legislation, prescription drugs for seniors, was hotly debated and the vote came largely on party lines, at least in the House, with only 8 Democrats supporting it and 8 Republicans against.

Part of Barack Obama’s “hope and change” message as a candidate included making Washington work in a bipartisan way, but that got little traction.  The Affordable Care Act, perhaps the most important piece of domestic legislation in 50 years, was passed on a straight party-line vote of Democrats.  Bipartisanship completely fell apart when Senate Majority Leader Mitch McConnell said Republicans’ “single most important thing” was making sure Obama was a one-term president, and Obama announced that he had “a pen and a phone” and would just take executive action to get things done.

Now we are shocked when Senator John McCain flies back to the Capitol from cancer treatments to announce he would not vote to repeal and replace Obamacare without a bipartisan conversation involving both parties to find the best solution.  An opinion piece in the conservative Washington Times called him “a traitor to the conservative cause.”  Apparently party discipline is more important than finding the right solution to the complex set of health care issues.

One unfortunate consequence of all this party-line voting and executive action is that policy swings back and forth or is held in the balance.  Obamacare is passed on a party-line vote and nearly repealed on one.  The same is true for Dodd-Frank.  Obama’s executive orders are simply overturned by his successor Donald Trump.  Is this any way to run a government?

One underlying problem is that the two major parties are now better sorted than before.  Whereas both the Republican and Democratic parties had some liberals, moderates and conservatives in an earlier day, now Republicans are predictably conservative and Democrats are liberal.  But another problem is that all politicians seem to care about in Washington is how a vote will best position them and their party for the next election, rather than what will make for a great piece of legislation.  Congress has devolved to marketing and winning, not deliberation and great policy.

Only when a few statesmen and the American voters stand up against party-line voting will anything change.

 

That was the topic for today. I hope you enjoyed the blog and learned a few new facts. Thank you for reading.

The Problems With FISA Part 3

Lacking Congressional Oversight

In practice, congressional oversight of the FISA process and the underlying materials is severely constrained. Although they have security clearances by virtue of their office, many lawmakers are kept far away from classified documents because they do not have cleared staff to assist in processing the information, and their requests are given lower priority than members of the intelligence oversight committees.

Even members of those House and Senate intelligence committees do not always have access to everything. In the case of the Nunes memo, only the “Gang of Eight” congressional leaders and a handful of others out of the 435 members of the House of Representatives and the 100 members of the Senate reportedly had access to the underlying FISA surveillance applications and un-redacted FISC opinions.

This problem has restricted Congress members before. In 2003, when then-Senate intelligence committee vice chairman Jay Rockefeller learned of the NSA’s unconstitutional spying programs under President George W. Bush, he had little capability to fight back. He wrote to then-Vice President Dick Cheney:

“As you know, I am neither a technician nor an attorney. Given the security restrictions associated with this information, and my inability to consult staff or counsel on my own, I feel unable to fully evaluate, much less endorse these activities."

Rockefeller, who knew of the programs, could not speak of them. For everyone else, reading FISA and FISC materials is close to impossible. Even after Congress passed the USA FREEDOM Act in 2015, requiring that significant FISC opinions be released to the public, these opinions are still heavily redacted and tightly guarded, and no FISA application material has ever been revealed to the public.

It’s for these reasons that EFF has long called for Congress to reform how it oversees surveillance activities conducted by the Executive Branch, including by providing all members of Congress with the tools they need to meaningfully understand and challenge activities that are so often veiled in extreme secrecy.

 

This is part three of a four-part series on today's topic. I hope you enjoyed the blog and learned a few new facts. Thank you for reading.