Methodological Problems of the American Community Survey

  • Posted on: 17 November 2018
  • By: benfell

In neoliberal society, where workers are devalued, and under Theory X management generally, such initiative is discouraged, but in any work situation I'm driven to make improvements where I can. My impatience with idiocy reigns here.

So there I was in the field, working on the American Community Survey for the U.S. Census Bureau, knowing I was in no position to make things better. This was a frustration I foresaw as I went through the process of getting hired. But it's been over seventeen years since I last had a "real job," that is, one that enables a worker to support him- or herself with dignity. This job wouldn't be a "real job" either, but I hoped it might lead to one.

I began to discuss the ethical issues with this survey in my last post, but also hinted at methodological problems.1 In this post, I'll take up the methodological issues, which, by the way, also impact the ethical issues.

According to a media advisory sent out this month,

The American Community Survey provides a wide range of important statistics about people and housing for every community in the nation. This survey is the only source of local estimates for most of the 40 topics it covers for communities across the nation. For example, it produces statistics for language, education, commuting, employment, mortgage status and rent, as well as income, poverty and health insurance.2

The survey is long, yet according to folks I talked to at the Bureau, it is one of the shorter ones in their repertoire. A pamphlet states the average length is 40 minutes; in training we were told the length generally ranges from 20 to 60 minutes; and my interim field supervisor told me to just say "it depends on your answers." This is much too long, and my interim field supervisor's obfuscation impacts informed consent.

One big problem here is the mish-mash of topics. There is no single purpose to this survey; it's just a bunch of them piled on top of each other. No respondent can possibly keep all these purposes in their head, so even a happy and fully cooperative participant cannot truly be giving informed consent.

The mish-mash of topics also contributes to defects in the survey itself. For instance, at separate points in the survey, we ask about race and about ethnicity. Both of these are social constructs, related to each other and not clearly distinguished from each other. On race, even though the Census Bureau does not view "Hispanic" or "Latinx" as a race, many folks identify as such, even when they have other countries of origin or the characteristics we often associate with race. Doing this right would require informing participants of what, exactly, the Bureau means by race and what, exactly, it means by ethnicity, and doing all this before we even ask the questions. And, of course, this sort of explanation would add to the length of an already-long survey. (I might add here that questions about gender assume a binary choice, which, as we have seen increasingly in recent years, is untenable.)

The length also stems from what often seem to be unnecessary or repetitive questions. For instance, families often have a single source of health insurance. Even when this is the case, the survey requires us to go through a list of sources of health insurance for each member of the household. We ask aloud about hearing difficulty; someone with such difficulty would have trouble hearing the question. We ask whether a child who has just bounded down a flight of stairs has difficulty walking or climbing stairs. Questions are quite often simply inapplicable.

Other questions require comprehension that neither the survey taker nor the participant might possess. It was very rapidly clear to me, for example, that one participant was not in fact using mobile data with her smartphone; she relies on wi-fi and simply does not have Internet on her phone except where she can connect to wi-fi. But she wasn't clear on that and, I suspect, many survey takers wouldn't be either (where discrepancies arise, survey takers can use probing questions to improve accuracy). Similarly, there are questions about financial instruments, both various kinds of loans and various kinds of investments, that I and probably most of my respondents are pretty hazy about.

In addition, I was deeply uncomfortable asking about citizenship, disabilities, reliance on the social safety net, and income generally. Many in our society judge people harshly for their answers to these questions, and people feel that judgment. Yes, we keep their answers confidential, but the simple fact is that it is an intrusion to ask.

In sum, this is a lengthy, complicated, and intrusive survey. Survey takers will feel pressure—I certainly did—to hasten the process, which may lead to leading questions. We're trained to avoid leading questions, but in practice, this is hard, even for someone like myself with far more training on this than the Census Bureau offers.

If I were in the god-like position needed to do what I always want to do, that is, to improve the situation, sweeping aside bureaucratic and legal obstacles, I think that as a very bare beginning, I would break up these surveys into smaller ones, affecting more participants. A few things happen here: First, one problem the Census Bureau has in gaining participation is that people associate the Bureau with the decennial census and suspect a scam when someone claiming to be from the Bureau shows up at any other time, for any other survey. Sure, I show identification, but how can anyone know that it's genuine? Census Bureau surveys, though voluntary, need to be more like jury duty. You might not be selected to participate for a long time, or you might get called every few years. But nearly everyone knows that people get called for jury duty all the time.

Second, breaking up the surveys into smaller units enables a coherent focus for each one. This would remedy the particular defect of informed consent I identified above. And third, of course, this reduces the length of time required for participation at a time. Fourth, this would also reduce the overall intrusiveness of each survey. I might be asking sensitive questions but I wouldn't be asking them about so many topics over an extended period of time.
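The jury-duty-style rotation I have in mind could be sketched roughly as follows. To be clear, the module names and the assignment scheme here are my own illustrative assumptions, not anything the Census Bureau actually does: each household is deterministically assigned one short, coherent module per cycle, rather than the whole omnibus questionnaire at once.

```python
import random

# Hypothetical topic modules carved out of the single omnibus survey;
# these groupings are illustrative only, not the Bureau's actual topics.
MODULES = [
    "housing and commuting",
    "education and language",
    "employment and income",
    "health insurance and disability",
]

def assign_module(household_id: int, year: int) -> str:
    """Rotate each household through one short module per cycle,
    jury-duty style: every household eventually gets called, but
    never for every topic at once.

    Seeding a private Random with the household and year makes the
    assignment reproducible, so the same household gets the same
    module if re-contacted within a cycle.
    """
    rng = random.Random(household_id * 10_000 + year)
    return MODULES[rng.randrange(len(MODULES))]
```

Because the assignment is a pure function of household and year, the sampling frame can be audited after the fact, and no household faces more than one short, single-purpose interview per cycle.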

Finally, I would seek to limit the mandatory function of the Census Bureau data collection to what is constitutionally required:

Representatives and direct Taxes shall be apportioned among the several States which may be included within this Union, according to their respective Numbers, which shall be determined by adding to the whole Number of free Persons, including those bound to Service for a Term of Years, and excluding Indians not taxed, three fifths of all other Persons. The actual Enumeration shall be made within three Years after the first Meeting of the Congress of the United States, and within every subsequent Term of ten Years, in such Manner as they shall by Law direct.3

That's a simple enumeration, which obviously should be adjusted for modern sensibilities: We count American Indians now, and no one should be counted as "three fifths" of anyone else, even if in an institutionalized regime of social inequality this is the practical effect. This simple enumeration as required in the Constitution doesn't directly require any of the topics that the American Community Survey covers. Which is to say that the American Community Survey should be truly and explicitly voluntary, with the normal procedure of informed consent: having been informed of the purpose and the risks of participation, participants are told that their participation is voluntary, that they may decline to participate without penalty, and that they may terminate their participation at any time without penalty.