The history of the human immunodeficiency virus (HIV) and acquired immunodeficiency syndrome (AIDS) dates back to 1981, when homosexual men with symptoms of a disease that is now considered typical of AIDS were first described in Los Angeles and New York. The men had an unusual type of lung infection (pneumonia) called Pneumocystis carinii (now known as Pneumocystis jiroveci) pneumonia (PCP) and rare skin tumors called Kaposi's sarcomas. The patients were noted to have a severe reduction in a type of cell in the blood (CD4 cells) that is an important part of the immune system. These cells, often referred to as T cells, help the body fight infections. Shortly thereafter, the disease was recognized throughout the United States, Western Europe, and Africa. In 1983, researchers in the United States and France identified the virus that causes AIDS, now known as HIV, which belongs to the group of viruses called retroviruses. While HIV infection is required to develop AIDS, the actual definition of AIDS is the development of a low CD4 cell count (<200 cells/mm3) or any one of a long list of complications of HIV infection, ranging from a variety of so-called "opportunistic infections" and cancers to neurologic symptoms and wasting syndromes.
What tests are used in the diagnosis of HIV?
In 1985, a blood test became available that measures antibodies to HIV, the body's immune response to the virus. The test used most commonly for diagnosing HIV infection is referred to as an ELISA (enzyme-linked immunosorbent assay). If the ELISA finds HIV antibodies, the result must be confirmed, typically by a test called a Western blot. HIV antibody tests remain the best method for diagnosing HIV infection. More recently, tests have become available that look for these same antibodies in saliva, some providing results within one to 20 minutes of testing.

Antibodies to HIV typically develop within several weeks of infection. During this interval, the so-called "window period," patients have virus in their body but will test negative by the standard antibody test. In this setting, the diagnosis can be made with a test that detects the virus itself in the blood rather than the antibodies, such as a test for HIV RNA or p24 antigen. A newer test has been approved that measures both HIV antibodies and p24 antigen, shortening the window period between infection and diagnosis. There are also many testing centers around the country that routinely screen HIV-antibody-negative blood samples for HIV RNA.

Although the tests for detecting HIV infection continue to improve, they still require that people volunteer for testing. It is estimated that approximately 20% of those infected with HIV in the United States are unaware of their infection because they have never been tested. To decrease the number of people who are unaware of their HIV infection status, the Centers for Disease Control and Prevention recommended in 2006 that everyone between the ages of 13 and 64 years be offered HIV testing whenever they encounter the health-care system for any reason. In addition, resources are available to help people find local HIV testing centers (http://www.hivtest.org).
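The screen-then-confirm sequence described above can be summarized as a simple decision flow. The sketch below is a minimal, purely illustrative model of that logic in Python; the function and parameter names are hypothetical, and it is not clinical software, as real-world interpretation follows laboratory and CDC protocols.

```python
# Illustrative sketch of the HIV testing sequence described in the text.
# NOT clinical software; function and parameter names are hypothetical.

def interpret_hiv_tests(elisa_reactive, western_blot_positive=None,
                        rna_or_p24_positive=None):
    """Summarize the screen-then-confirm logic described above."""
    if elisa_reactive:
        # A reactive ELISA screen must be confirmed, typically by Western blot.
        if western_blot_positive:
            return "HIV infection confirmed by Western blot"
        return "ELISA reactive; confirmatory testing (e.g., Western blot) needed"
    # A negative antibody test during the "window period" (the first several
    # weeks after infection) can miss early infection, so tests that detect
    # the virus itself (HIV RNA or p24 antigen) may be used instead.
    if rna_or_p24_positive:
        return "Acute (window-period) infection suggested by HIV RNA/p24"
    return "No laboratory evidence of HIV infection on these tests"

# Example: an antibody-negative sample with detectable HIV RNA,
# as in the RNA screening of antibody-negative samples described above.
print(interpret_hiv_tests(elisa_reactive=False, rna_or_p24_positive=True))
```

The key point the sketch captures is that a single test result is rarely final: reactive screens are confirmed, and negative antibody results early after exposure may be rechecked with virus-detection tests.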