Evaluating the credibility of a person's statements is difficult, if not impossible, without knowing, for example, that person's background, training, affiliations, education, or experience. However, we feel that a guide to a person's theoretical expertise can be helpful, so we have built theoretical expertise ranking charts for each ProCon.org website to help differentiate the theoretical expertise of the various sources on our sites.
Our Rankings for Sources at Concealed Guns ProCon.org:
5 Stars - Government Reports and Peer-Reviewed Studies
Official vetted reports from international government bodies (such as the United Nations and the European Union), foreign governments (federal-level agencies such as France's Ministry of Justice, South Africa's Ministry of Health, or Japan's Office of the Prime Minister), and US government agencies (state, federal, and quasi-government agencies, including the Smithsonian Institution, the National Academy of Sciences, and the Legal Services Corporation), as well as peer-reviewed studies from academic journals (such as Science, Nature, and the New England Journal of Medicine), tend to pass through multiple editorial and ideological filters, and they normally receive rigorous review from experts before being formally issued.
4 Stars - Key Experts
We do not believe that this issue has a Key Experts category. [Note: the Key Experts definition varies among sites that have this designation.]
3 Stars - Experts
Individuals with PhDs, JDs, or equivalent advanced degrees in fields relevant to the concealed carry of handguns, and top-level federal government officials significantly involved in gun control and related issues. [Note: the Experts definition varies by site.]
2 Stars - Media and Academic Journals
Mainstream print, broadcast, radio, and internet media entities such as the New York Times, CNN, ABC News, National Public Radio, Slate.com, and Seattlepi.com, as well as influential academic journals such as the Journal of the American Medical Association, the Quarterly Journal of Economics, and Foreign Policy.
1 Star - All Other Sources
Individuals and organizations that do not fit into the other star categories.
Our theoretical expertise rankings are based upon the following premises:
The courts and many people equate a level of education and knowledge with a person's theoretical expertise.
Although ProCon.org does not have the resources to make complex evaluations of the expertise of each contributor to our website, and although any such evaluation would still contain a fair amount of subjectivity, we believe our theoretical ranking is more desirable than no ranking at all, and that it should be accurate at least 80% of the time.
We have therefore assigned a one- to five-star rating based on the theoretical expertise each of our sources should possess.
Additionally, we have customized the star categories to each of our sites' specific content because of their different subject matter.
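The tiers described above amount to a simple rule-based classification. As a rough sketch only (the function, field names, and dictionary keys below are our own hypothetical illustration, not anything ProCon.org publishes):

```python
def star_rating(source):
    """Hypothetical sketch of the Concealed Guns site's star tiers.
    `source` is a dict with illustrative keys of our own invention."""
    # 5 stars: vetted government reports and peer-reviewed studies
    if source.get("kind") in ("government_report", "peer_reviewed_study"):
        return 5
    # 4 stars (Key Experts) does not exist on this particular site
    # 3 stars: individuals with relevant advanced degrees, or top federal officials
    if source.get("kind") == "individual" and (
        source.get("degree") in ("PhD", "JD") or source.get("top_federal_official")
    ):
        return 3
    # 2 stars: mainstream media and influential academic journals
    if source.get("kind") in ("mainstream_media", "academic_journal"):
        return 2
    # 1 star: individuals and organizations fitting no other category
    return 1


# The agency itself rates 1 star, while a report it issues rates 5 stars
print(star_rating({"kind": "organization"}))        # 1
print(star_rating({"kind": "government_report"}))   # 5
```

The sketch makes the site-to-site variation concrete: a different ProCon.org site would swap in its own degree list and add or drop the 4-star branch.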
Pros and Cons of Our Theoretical Expertise Ranking System:
Some people have praised our ranking system as a tool for differentiating the theoretical expertise of our sources. Others believe our star system should be eliminated because it is too subjective. The following chart sets out the main pros and cons of our theoretical expertise system:

Pro: Provides a quick guide to theoretical source quality.
Con: Suggesting that one source is more theoretically expert than another reflects bias.

Pro: Our star system is based on a person's education and knowledge, a concept used in the court system to distinguish reliable information.
Con: The number of stars prejudges sources for readers rather than allowing readers to judge the sources' expertise for themselves.

Pro: Stars add a simple, clean visual aid for a source's theoretical expertise.
Con: Some 1-star individuals can be more expert than 3- or 4-star sources, and all organizations receive 1 or 2 stars regardless of their actual expertise.

Pro: We estimate our star system is accurate roughly 80% of the time.
Con: We estimate our star system is inaccurate roughly 20% of the time.
Some Flaws in Our Theoretical Expertise Ranking:
The following are examples of flaws or problems with our theoretical expertise rankings:
Government Reports - Some have questioned why we have chosen to give government reports our highest rating of five stars.
Our thinking is that government facts and statistics are generally reliable. What is less reliable, and hence earns a lower rating, is when government personnel quote such facts out of context, or worse, misuse them, whether on purpose or by accident.
For example, we generally would give our highest rating of five stars to a government report stating that 52,850 people were killed in auto accidents in a given time period, but we would consider a government employee who said in a speech, "Fifty thousand people died last year in auto accidents," to have less expertise. The government employee would probably receive one, three, or in some cases four stars, depending on the person's education and position.
Star Ranking Challenges at Organizations - We rank most organizations as 1-star because they are often dynamic and subject to myriad influences, making a ranking difficult if not impossible.
Expertise Leap Between One-Star Organizations and Their Five-Star Reports - A government organization would be categorized as a one-star source, but a report issued by that organization would probably receive our five-star ranking.
No Background Verifications - We do not verify the professional or educational background of our sources. For instance, if someone reports that they hold a PhD, MD, or JD degree, then we presume that they hold that degree. We recognize that people may lie on their resumes, but we don't check resumes for accuracy.
Each Site Has a Different Scale - Election officials are 3-star "Experts" on the Voting Machines website, but probably wouldn't be on the Euthanasia website.
Some Expertise Categories Do Not Exist on Other Sites - The Born Gay website lists people with PhDs in Psychology as 4-star "Key Experts," while other sites may not even have a 4-star "Key Expert" category.
Theoretical Expertise Can Change with Events - For example, leaders of Hamas were 1-star sources on the Israeli-Palestinian ProCon.org site until their party won the March 2006 Palestinian elections, and now those same people are considered 3-star sources as government officials.
Determining What Is Mainstream Is Subjective - We classify media and academic journals that are "mainstream" and "influential" as 2-star sources. We have done our best to determine what is "mainstream" and "influential"; however, such distinctions are themselves subjective.
Doctors Get 4 Stars on the Medical Marijuana Site Even if They Know Nothing About Medical Marijuana - On our medical marijuana site, physicians receive a 4-star rating based solely on their MD degree, with no consideration of whether they have any experience with, or knowledge of, medical marijuana. Meanwhile, a PhD research scientist with extensive medical marijuana experience receives a 3-star rating.
Some People with Great Actual Expertise Are Ranked as Having Low Theoretical Expertise - Some sources do not have the advanced degrees, high-level government positions, or other qualifications that would earn them a 3- or 4-star ranking, even though they may have vast experience and expertise in a subject. For example, Thomas Friedman is a Pulitzer Prize-winning writer on the Israeli-Palestinian conflict; however, since he does not have a PhD, hold a government office, or otherwise meet the theoretical expertise criteria, he receives our 1-star ranking. Mary Lynn Mathre has a master's degree in nursing and is a Registered Nurse who has worked closely for decades with patients who use medical marijuana, yet she receives a 1-star ranking because she does not have an MD, PhD, JD, judgeship, or government office. Norma Jean Almodovar is the Executive Director of a prostitution-related advocacy group and has worked as both a call girl and a cop, yet she receives a 1-star ranking on our prostitution website because she does not have an MD, PhD, JD, or official government position.