
Surveys

The survey activity generated data that answer questions beyond what the inventory can provide and served as an integral link to our in-depth studies. The data also tell us more about the accessibility, use, non-use, and perceived benefits of public access venues. Surveys were administered to representative samples of public access venues and populations (both venue users and non-users). We conducted surveys in Bangladesh, Brazil, Chile, Ghana, and the Philippines.

Overview
Survey research questions
Survey scope
Survey instruments
Sampling strategy
Survey timeline
Survey data

Survey overview

The surveys conducted as part of the Global Impact Study contribute to the overarching goals of the project in four main ways. The surveys:

  1. Serve as integral links to the in-depth studies
  2. Tell us more about the accessibility, use, and perceived benefits of public access venues
  3. Inform us about reasons for non-use
  4. Provide an insight into the potential magnitude of impacts of public access ICTs when viewed in the context of the inventory data

The survey activity consists of three distinct surveys: 1) public access ICT venue users, 2) public access ICT venue operators, and 3) public access ICT non-users. The main purpose of the user survey is to gather information on users’ characteristics, usage patterns, and perceived impacts of using public access ICTs. The venue operator survey is designed to gather information on the operational characteristics, design, services, and costs of providing public access to ICTs. The main purpose of the non-user survey is to gather information on the characteristics of non-users and reasons for not using public access ICTs.

In each country, a total of approximately 1,000 users, 400 non-users, and 250 operators of public access ICTs were surveyed at approximately 250 randomly selected public access locations. The venue sample was selected using the inventory of public access venues in the country, along with input from the research teams. All user and operator surveys were researcher-administered, took place at the venues, and were designed to take 45-60 minutes to complete. The non-user survey was conducted in communities surrounding a subset of the venues sampled in the user and operator surveys and was designed to take approximately 30 minutes.
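
To illustrate the venue selection step, a simple random draw of roughly 250 venues from a country's inventory could be sketched as follows. This is only an illustrative sketch: the inventory structure, field names, and venue types here are hypothetical stand-ins for the actual Global Impact Study inventory data.

```python
import random

# Hypothetical inventory of public access venues for one country;
# in practice this would come from the Global Impact Study inventory data.
inventory = [{"venue_id": i, "type": t}
             for i, t in enumerate(["cybercafe", "telecentre", "library"] * 100)]

SAMPLE_SIZE = 250

random.seed(42)  # fixed seed so the draw is reproducible
venue_sample = random.sample(inventory, SAMPLE_SIZE)  # sampling without replacement

print(len(venue_sample))  # 250
```

In practice, the sample selection also incorporated input from the country research teams, so the actual strategy was not a purely random draw like the one shown here.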

Back to top

Research questions

To create the surveys, the Survey Working Group prioritized a set of research questions to investigate. A corresponding set of hypotheses was developed to go with each research question, as were potential indicators to test each hypothesis. In the end, the Survey Working Group decided upon 14 research questions:

  1. What is the demographic profile of public access ICT users and non-users?
  2. Apart from public access ICTs, what other information and communication resources do public access ICT users and non-users have?
  3. What are the ICT skills and ICT use comfort levels of public access users?
  4. Why do people go to public access ICT venues?
  5. What are the reasons for non-use of public access ICT venues?
  6. What do people do at public access venues?
  7. How accessible are public access ICT venues and services to different types of populations?
  8. How do the design, services, and operations of public access ICTs affect usage patterns?
  9. What do public access users see as the impacts of using public access ICTs?
  10. What outcomes can be associated with public access ICT use in different domains?
  11. Do non-users experience outcomes from non-public access information and communication resources that are similar to the outcomes users of public access ICTs experience?
  12. Does public access ICT use have indirect impacts?
  13. What is the value of public access ICTs to users?
  14. What is the cost of providing public access ICTs?

Back to top

Survey scope

For the purposes of this survey (and the project in general), we have defined “public access” to describe only facilities with a substantial and/or visible ICT presence. In addition to traditional cybercafés and telecentres, such places may include coffee shops with large numbers of computers connected to the Internet: although delivering ICT is not the primary service of such a facility, there remains a substantial ICT presence. However, a restaurant with one computer in a corner falls outside the scope of this project, as ICT service provision at that facility is not substantial.

The term “public” in “public access” refers to the characteristic of venues that are open to the public and do not have restrictions on who can use them. “Public” does NOT refer to a venue’s legal status or source of funding (i.e., it does not indicate governmental support).

When we discuss the presence of ICTs at a venue, we are referring to computers and the Internet. This covers venues that have computers without Internet connections as well as those that have computers with Internet connections.

Back to top

Survey instruments

User survey

The user survey instrument is divided into six sections:

  1. Basic information
  2. ICT exposure and skills
  3. Access
  4. Usage
  5. Perceived impacts
  6. Demographics

Descriptions of each section:

1. Basic information: This section asks the interviewer to input general information about the survey conditions, including the location, privacy of the interview, and the interview language.

2. ICT exposure and skills: This section captures data on users’ prior exposure to computers and the Internet as well as their skill level and comfort with using these technologies. The section also collects data on respondents’ other information and communication resources.

3. Access: This section is designed to capture data on the types of venues users frequent, the costs associated with visiting venues, user visitation patterns, and the factors that shape users’ decisions on which venue to patronize. The section is broken into three parts:

Public access in general: Questions here are geared to capture information about the general reasons why people use public access ICTs and why they choose certain venues.

The venue usually visited: This sub-section is designed to capture data on the venue the user usually visits, including frequency of use; duration of use; and method, costs, and time to travel to the venue.

This venue: This sub-section is similar to the previous one, but focuses on the venue at which the survey is being conducted.

4. Usage: In this section, users answer questions primarily about the venue they usually visit. These questions relate to types of activities, frequency of performing those activities, interaction with other users, use of venue support staff, and willingness to pay for services.

5. Perceived impacts: This section captures data on the outcomes and impacts of public access ICT use in various development domains. Some questions capture data on users’ perceptions of impact, while others attempt to trace impact using indicators attached to specific domains.

6. Demographics: This section captures basic user demographics, including gender, age, education, occupation, and income level.

Venue survey

The venue survey instrument is divided into ten sections:

  1. About the venue
  2. About the venue layout
  3. Venue infrastructure
  4. Financing and costs
  5. Staffing
  6. Services
  7. Traffic and usage
  8. Changes at the venue
  9. Venue impacts
  10. Demographics

Descriptions of each section:

1. About the venue: This short section captures basic information about the venue and the interview setup, including verification of the venue contact information, taxonomy, and the privacy of the interview. Most of this information is filled out ahead of time using the Global Impact Study inventory data and verified upon arrival at the venue.

2. About the venue layout: This section captures basic information about the venue layout, including the condition of the building, visibility from the street, wheelchair accessibility, and computer configuration. This section is also completed prior to beginning the interview and is based on the interviewer’s visual assessment of the venue.

3. Venue infrastructure: In this section, the respondent answers questions about the venue infrastructure, including types of computers, operating systems, and the Internet connection.

4. Financing and costs: This section captures information on the business model of the venue and costs and expenses associated with providing public access to computing.

5. Staffing: This section collects information about the venue staff, including the number of staff and staff characteristics.

6. Services: This section captures information on the types of services offered, fees for services, and rules and use restrictions.

7. Traffic and usage: In this section, the respondent provides information on the hours of operation, traffic patterns, user characteristics, and user behavior.

8. Changes at the venue: This section collects information on changes in the venue infrastructure and usage patterns since the respondent began working at the venue.

9. Venue impacts: In this section, the respondent provides his/her perception on ways the venue has impacted him/her, the users, and the community.

10. Demographics: This section captures basic venue operator demographics, including gender, age, education, occupation, and income level.

Non-user survey

The non-user survey instrument is divided into five sections:

  1. Household screening and information
  2. ICT exposure, skills, and usage
  3. Perceived impacts
  4. Cost valuation
  5. Demographics

Descriptions of each section:

1. Household screening and information: This section asks the interviewer to screen the household to:

a. Assess whether or not the household qualifies for the survey

b. Select respondents based on age and gender

c. Identify the appropriate sections of the survey for the respondent. Four types of respondents are distinguished for this survey, and each answers a different section of the instrument:

i. Never public access user, non-computer user
ii. Never public access user, computer user
iii. Ex-public access user, computer user
iv. Ex-public access user, non-computer user
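
The four-way screening above can be sketched as a simple two-question classification. The function name and its boolean parameters are illustrative assumptions, not part of the actual survey instrument:

```python
def respondent_type(ever_used_public_access: bool, uses_computer: bool) -> str:
    """Classify a non-user survey respondent into one of the four
    screening categories used to route them through the instrument."""
    access = ("Ex-public access user" if ever_used_public_access
              else "Never public access user")
    computer = "computer user" if uses_computer else "non-computer user"
    return f"{access}, {computer}"

print(respondent_type(False, False))  # Never public access user, non-computer user
print(respondent_type(True, True))    # Ex-public access user, computer user
```

Each of the four resulting categories corresponds to a different section of the non-user instrument.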

2. ICT exposure, skills, and usage: This section captures data on people’s prior exposure to computers and the Internet as well as their skill level and comfort with using these technologies. The section also collects data on respondents’ other information and communication resources, as well as types of activities performed.

3. Perceived impacts: This section captures data on the indirect outcomes of public access ICT use, as well as the direct impacts of non-public access ICT use in various development domains. Some questions capture data on people’s perceptions of impact, while others attempt to trace impact using indicators attached to specific domains. The section has two parts: one solicits answers from computer users about the outcomes of general computer/Internet use; the other solicits answers from non-computer users about the outcomes of their use of other (non-computer-based) resources to meet their information and communication needs.

4. Cost valuation: This section uses the contingent valuation method to collect data to estimate the value non-users place on the availability of public access ICT venues.
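
With contingent valuation, respondents state what they would be willing to pay for a good that has no market price, and those stated amounts are aggregated into a value estimate. A minimal sketch of that aggregation step is shown below; the response values are fabricated for illustration, and real contingent valuation analysis involves more careful elicitation and modeling than a simple mean:

```python
from statistics import mean, median

# Hypothetical contingent valuation responses: each non-user's stated
# willingness to pay (per month, in local currency) for the continued
# availability of public access ICT venues.
wtp_responses = [0, 5, 10, 10, 15, 20, 0, 25, 5, 10]

# Summary statistics commonly reported in contingent valuation studies.
print(mean(wtp_responses))    # 10
print(median(wtp_responses))  # 10
```

The zeros matter: respondents who place no value on public access are part of the sample, and dropping them would bias the estimate upward.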

5. Demographics: This section captures basic demographics, including gender, age, education, occupation, and income level.

Download survey instruments

The three survey instruments, as well as the survey codebooks, can be found in our web library.

User survey

Venue survey

Non-user survey

Back to top

Sampling strategies

User survey sampling strategy

Venue survey sampling strategy

Non-user survey household sampling strategy

Non-user survey respondent sampling strategy

Back to top

Survey timeline

December 2009 – January 2010

Distribution of first drafts of user and venue surveys to the project team for feedback; user and venue surveys revised

January – July 2010

Development of individual country venue and user sample selection strategies

February – March 2010

Distribution of user and venue surveys to the project team for comment; user and venue surveys revised

April 2010

Translation and formatting of surveys in preparation for testing

May 2010

Cognitive testing and report writing in Bangladesh, Brazil, Chile, and the Philippines; user and venue surveys revised based on report data

June – July 2010

Field testing and report writing in Bangladesh, Brazil, Chile, and the Philippines; user and venue surveys revised based on report data

August – December 2010

User and venue survey implementation

March 2011 – May 2011

Non-user survey implementation

May 2011 – August 2011

User, venue, and non-user survey implementation in Ghana

January 2011 – October 2012

Survey data analysis

Back to top

Survey data

The user, venue, and non-user survey data can be found in the Global Impact Study Web Library.

Back to top