261 Cards in this Set
6 categories of survey questions
|
1) Behaviour
2) Attitudes/beliefs/opinions
3) Characteristics
4) Expectations
5) Self-classification
6) Knowledge |
|
in survey research, the researcher begins with ___________ and ends with _______
|
a theoretical or applied research problem, empirical measurement and data analysis
|
|
interview schedule
|
a set of questions read to a respondent by an interviewer who also records responses
|
|
a _____ should be conducted before the final surveys are distributed
|
pilot test
|
|
3 principles for effective survey questions
|
1) keep it clear
2) keep it simple 3) keep the respondents' perspective in mind |
|
12 things to avoid when writing survey questions
|
1) Avoid jargon/slang/abbreviations
2) Avoid ambiguity/confusion/vagueness
3) Avoid emotional language
4) Avoid prestige bias
5) Avoid double-barrelled questions
6) Do not confuse beliefs with reality
7) Avoid leading (loaded) questions
8) Avoid asking questions beyond respondents' capabilities
9) Avoid false premises
10) Avoid asking about intentions in the distant future
11) Avoid double negatives
12) Avoid overlapping or unbalanced response categories |
|
What is prestige bias?
|
something to avoid when writing survey questions: if you associate a question with a prestigious group/person, the respondent may answer based on how they feel about that group/person, not the actual issue
|
|
in surveys, people are likely to underreport three things:
|
having an illness or disability, engaging in illegal or deviant behaviour, or revealing their financial status.
|
|
social desirability bias
|
occurs when respondents distort answers to make their reports conform to social norms (survey research). We try to reduce this by writing questions so that they seem to accept more behaviours, or offering face saving alternatives
|
|
a skip/contingency question
|
a two- (or more) part question. the answer to the first part of the question determines which of two different questions a respondent receives next
|
|
probes
|
following up on a person and learning about their answer to a closed-ended response
|
|
partially open questions
|
a set of fixed choices with a final open choice of "other"
|
|
for large scale surveys, what kinds of questions are often used in pilot tests
|
open ended
|
|
standard-format question
|
does not offer a "dont know" choice
|
|
quasi-filter question
|
offers respondents a "dont know" alternative
|
|
full-filter question
|
asks if a respondent has an opinion, and then asks for the opinion of those who say they have one
|
|
floaters
|
they are those who lack a belief or opinion but give an answer anyway, if asked. their answers are often inconsistent
|
|
response set bias
|
when respondents tend to agree with every question in a series rather than thinking through their answer to each question
|
|
are rankings or ratings better?
|
ranking is better because it forces responses into a hierarchy; rating allows people to rate everything the same
|
|
who is most influenced by minor wording differences on surveys
|
less educated respondents
|
|
Hunter
|
found that using the word "poor" on a survey is better than using "welfare"
|
|
3 issues regarding survey question order/sequence
|
1) Organization of the questionnaire
2) Order effects
3) Context effects |
|
context effects
|
in survey research, an effect in which the overall tone or set of topics heard by a respondent influences how they interpret the meaning of subsequent questions (use funnel questions to reduce this)
|
|
matrix question
|
a survey question in which a set of questions is listed in a compact form together, all questions sharing the same set of answer categories
|
|
What is the cheapest type of survey
|
mail/self administered
|
|
what 4 type of questions do poorly in mail questionnaires?
|
questions using visuals, contingency questions, open ended questions, and complex questions
|
|
which type of survey has the highest response rates and permits the longest questionnaires?
|
face-to-face
|
|
probes
|
in survey research, a neutral request to clarify an ambiguous answer, to complete an incomplete answer, or to obtain a relevant response. (3-5 second pause.. or nonverbal communication)
|
|
4 ethical issues with surveys
|
1) the invasion of privacy
2) voluntary participation
3) exploitation of surveys and pseudosurveys: some people use them to mislead others
4) when people misuse survey results or use purposely rigged surveys |
|
pseudosurvey
|
when someone who has little/no real interest in learning info from a respondent uses the survey format to try to persuade someone to do something
|
|
2 motivations for using probability/random sampling
|
1) saves money and time
2) accuracy |
|
7 types of nonprobability sampling
|
1) Haphazard
2) Quota
3) Purposive/judgement
4) Snowball
5) Deviant case
6) Sequential
7) Theoretical |
|
quota sampling
|
nonprobability. the researcher first identifies general categories of people, then decides on a predetermined number of cases to get in each category; after that, selection within each category is haphazard
|
|
purposive/judgement sampling
|
nonprobability. a researcher uses judgement in selecting cases with a particular purpose in mind. appropriate in three situations: 1) when used to select unique, informative cases, 2) difficult to reach/specialized populations, 3) wanting to identify particular types of cases for in-depth investigation
|
|
snowball sampling
|
nonprobability. method of selecting like cases in a network.
|
|
sociogram
|
a diagram of circles connected with lines to show interdependent networks
|
|
researchers often use snowball sampling in combo with _____
|
purposive/judgement sampling
|
|
deviant case sampling
|
nonprobability. the researcher seeks out cases that differ from dominant patterns.
|
|
sequential sampling
|
nonprobability. the researcher continues to gather cases until no new information or diversity of cases is being added, or until resources/time/energy are exhausted
|
|
theoretical sampling
|
nonprobability. tied to grounded theory. it is an ITERATIVE sampling technique in which the sample size is determined when the data reach theoretical saturation
|
|
iterative
|
cyclical relationship between sampling, data collection, and data analysis
|
|
theoretical saturation
|
a term associated with the grounded theory approach that refers to the point at which no new themes emerge from the data and sampling is considered complete
|
|
sampling element
|
the name for a case or single unit to be selected
|
|
target population
|
the large general group of many cases from which a sample is drawn and which is specified in concrete terms
|
|
sampling ratio
|
the ratio of the size of the sample to the size of the target population
|
|
populations are ____ concepts
|
abstract
|
|
sampling frame
|
a list of cases in a population, or the best approximation of it. sampling frames are almost always a bit inaccurate
|
|
population parameter
|
any characteristic of a population
|
|
sampling error
|
how much a sample deviates from being representative of a population. sampling error CAN be calculated in probability sampling
|
|
5 types of probability sampling
|
1) Simple random
2) Systematic
3) Stratified
4) Cluster
5) Random-digit dialing |
|
simple random sampling
|
probability. a researcher creates a sampling frame and uses a pure random process to select cases. each will have an equal probability of being selected. a random numbers table is often used
|
|
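The card above can be sketched in code. A minimal illustration using Python's standard library; the sampling frame here is a hypothetical list of case IDs:

```python
import random

random.seed(42)  # fixed seed so this sketch is reproducible

# hypothetical sampling frame: 1,000 case IDs
frame = list(range(1, 1001))

# draw a simple random sample of 50 cases without replacement;
# every element has an equal probability of being selected
sample = random.sample(frame, k=50)
print(len(sample), len(set(sample)))  # 50 distinct cases
```

In practice the "pure random process" from the card is a random numbers table or a pseudo-random generator like this one.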
unrestricted random sampling
|
sampling with replacement
|
|
sampling distribution
|
a distribution created by drawing many random samples from the same population. it suggests that over many random samples, the true population parameter will be more common than any other result
|
|
central limit theorem
|
as the number of different random samples in a sampling distribution increases, the pattern formed is normal, and the centre is equal to its population parameter
|
|
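The two cards above can be illustrated with a small simulation. A sketch, assuming a hypothetical normally distributed population; all figures are illustrative:

```python
import random
import statistics

random.seed(0)

# hypothetical population with a true mean of about 50
population = [random.gauss(50, 10) for _ in range(10_000)]

# draw many random samples of n=100 and record each sample mean
sample_means = [
    statistics.mean(random.sample(population, 100))
    for _ in range(1_000)
]

# the sampling distribution of the means is roughly normal and
# centres on the population parameter (the population mean)
print(round(statistics.mean(sample_means), 1))
```

The printed value lands very close to the population mean, which is the card's point: over many random samples, the true parameter is the most common result.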
confidence interval
|
a range around a specific point used to estimate a population parameter. they allow a researcher to say with a high level of confidence that the true pop. parameter lies within a certain range
|
|
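A minimal sketch of computing a 95% confidence interval around a sample mean using the normal approximation; the data here are simulated and illustrative:

```python
import math
import random
import statistics

random.seed(1)

# hypothetical sample of 400 measurements
sample = [random.gauss(50, 10) for _ in range(400)]

mean = statistics.mean(sample)
se = statistics.stdev(sample) / math.sqrt(len(sample))  # standard error of the mean

# 95% confidence interval: the population parameter is estimated to lie
# within roughly 1.96 standard errors of the sample mean
low, high = mean - 1.96 * se, mean + 1.96 * se
print(f"95% CI: ({low:.1f}, {high:.1f})")
```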
systematic sampling
|
the same as simple random, with a shortcut for selection. a researcher selects every kth case in the sampling frame using a sampling interval. not appropriate for cyclical data
|
|
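A sketch of the every-kth-case shortcut described above; the frame and sizes are hypothetical:

```python
import random

random.seed(2)

frame = list(range(1, 1001))  # hypothetical sampling frame of 1,000 cases
n = 50                        # desired sample size
k = len(frame) // n           # sampling interval: every 20th case

start = random.randrange(k)   # random start within the first interval
sample = frame[start::k]      # then select every kth case
print(len(sample))            # 50
```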
stratified sampling
|
a researcher divides the population into strata, and then draws a random sample from each subpopulation. this is often used when a stratum of interest is a small portion of the population and could easily have been missed if only random processes were used
|
|
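A sketch of stratified sampling with a small stratum that a purely random process could easily miss; the strata and counts are hypothetical:

```python
import random

random.seed(3)

# hypothetical population: 900 urban cases and a small rural stratum of 100
population = [("urban", i) for i in range(900)] + [("rural", i) for i in range(100)]

# divide the population into strata
strata = {}
for group, case in population:
    strata.setdefault(group, []).append((group, case))

# draw a random sample from each subpopulation
sample = []
for group, cases in sorted(strata.items()):
    sample.extend(random.sample(cases, 25))

print(len(sample))  # 25 urban + 25 rural = 50
```

Stratifying guarantees the rural stratum is represented, which a simple random sample of 50 would not.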
cluster sampling
|
addresses the issue of dispersed populations. it has multiple stages in which units are randomly selected and then samples are drawn from the clusters. a researcher must decide the number of clusters and the number of elements within a cluster
|
|
proportionate cluster sampling
|
when all clusters are the same size. this is uncommon
|
|
probability proportionate to size (PPS)
|
used in cluster sampling when clusters are not all the same size: each cluster is selected with a probability proportional to its size, so that each sampling element in the population ends up with an equal overall probability of being selected
|
|
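A sketch of PPS cluster selection using weighted random choice; the cluster names and sizes are hypothetical:

```python
import random

random.seed(4)

# hypothetical clusters (e.g. schools) of unequal size
clusters = {"A": 1200, "B": 300, "C": 500}

# select clusters with probability proportional to size (with replacement),
# so elements in large and small clusters get equal overall selection chances
names = list(clusters)
sizes = [clusters[name] for name in names]
picks = random.choices(names, weights=sizes, k=2)
print(picks)
```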
random digit dialing
|
the researcher identifies active area codes and exchanges, and then randomly selects four-digit numbers. the sampling element is a phone number, and the population is all possible numbers
|
|
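A sketch of random digit dialing; the area codes and exchanges below are hypothetical placeholders:

```python
import random

random.seed(5)

# hypothetical active area codes and exchanges
area_codes = ["416", "604", "902"]
exchanges = ["555", "273", "488"]

def random_phone_number():
    # the sampling element is a phone number: a randomly chosen
    # area code and exchange plus four random digits
    return f"{random.choice(area_codes)}-{random.choice(exchanges)}-{random.randrange(10000):04d}"

numbers = [random_phone_number() for _ in range(5)]
print(numbers)
```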
for small populations (under 1,000) a researcher needs a ____ sampling ratio of about __%
|
LARGE, 30%
|
|
for moderately large populations (10,000), a researcher needs a ______ sampling ratio of about __%
|
SMALLER, 10%
|
|
For large populations (over 150,000), a researcher can use a ___% sampling ratio
|
1%
|
|
for very large populations (over 10 million) a researcher can use a ___ % sampling ratio
|
0.025%
|
|
a researcher's decision about the best sample size depends on three things
|
1) the degree of accuracy needed
2) the degree of variability/diversity in the population
3) the number of different variables examined at the same time in data analysis |
|
rule of thumb is to have about ____ cases for each subgroup to be analyzed
|
50
|
|
_____ and _____ bridge the gap in measurement
|
conceptualization and operationalization
|
|
_____ and _______ and ______ bridge the gap in sampling
|
sampling frames, the sampling process, and inference
|
|
sampling error is based on 2 things
|
1) sample size 2) the amount of diversity in the sample
|
|
the sample with ___ cases will have a smaller sampling error and ____ confidence intervals
|
MORE, NARROWER
|
|
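The relationship on this card can be sketched numerically: the standard error shrinks with the square root of the sample size, so more cases give narrower confidence intervals. A hypothetical population standard deviation of 10 is assumed:

```python
import math

sd = 10       # hypothetical population standard deviation
widths = {}

# larger samples -> smaller standard error -> narrower 95% CI
for n in (100, 400, 1600):
    se = sd / math.sqrt(n)
    widths[n] = 2 * 1.96 * se  # total width of a 95% confidence interval
    print(n, round(widths[n], 2))
```

Note that quadrupling the sample size only halves the interval width, which is why accuracy gains get expensive.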
measurement extends ____
|
human senses
|
|
Pineo-Porter scale
|
an occupational prestige scale
|
|
the Blishen scale
|
a socioeconomic index of occupations that takes occupation, education, and earnings into account
|
|
there are three distinctions between the qualitative and quantitative approaches to measurement:
|
1) Timing: quan measures everything beforehand, while qual measures during the data-collection process
2) The data: quan data take the form of numbers, while qual data come in many forms (words, images, actions)
3) Linkages: quan develops concepts before data collection, but qual develops concepts during data collection |
|
quantitative researchers use an ____ approach
|
deductive. this means they take an abstract idea, follow with a measurement procedure, and end with empirical data that represent the ideas
|
|
qualitative researchers use an ____ approach
|
inductive. they begin with empirical data, follow with abstract ideas, relate ideas and data, and end up with a mixture of ideas and data
|
|
conceptualization
|
the process of developing clear, rigorous, systematic definitions for abstract ideas/concepts
|
|
operationalization
|
the process of moving from a conceptual definition of a construct to a set of specific activities/measures that allow a researcher to observe it empirically
|
|
example of an operational definition
|
surveys.
|
|
conceptual hypothesis
|
the causal relationship between two constructs
|
|
quantitative research sequence
|
conceptualization --> operationalization --> applying the operational definition for data collection (empirical) (this is clearly deductive as we are moving from abstract to concrete)
|
|
in qualitative data, ____ often precedes _______
|
operationalization , conceptualization
|
|
reliability
|
dependability or consistency. it suggests that the same thing is repeated/recurs under identical conditions
|
|
validity
|
suggests truthfulness. refers to the ability to generalize findings outside a study, the quality of measurement, and the proper use of procedures
|
|
reliability and validity are impossible to achieve because they are ____
|
ideals
|
|
in quantitative studies, there are 4 ways to increase reliability
|
1) clearly conceptualize all constructs: each measure should indicate one and only one concept
2) increase the level of measurement
3) use multiple indicators of a variable
4) use pilot studies and replication |
|
measurement validity
|
the only type of validity that quantitative researchers are concerned with. it is how well the conceptual and operational definitions mesh with each other
|
|
is validity or reliability harder to achieve
|
validity
|
|
3 types of measurement validity:
|
1) face validity
2) content validity 3) criterion validity: concurrent and predictive |
|
face validity
|
easiest and most basic. it is a judgement by the scientific community that the indicator really measures the construct.
|
|
content validity
|
special type of face validity. requires that a measure represents all the aspects of the conceptual definition of a construct (e.g. feminism is in many areas of life and all areas must be addressed)
|
|
criterion validity
|
comparing it to another measure of the same construct that is widely accepted (agrees with an external source). There are two subtypes: concurrent validity, and predictive validity
|
|
concurrent validity
|
a subtype of criterion validity. relies on a pre-existing and already accepted measure to verify the indicator of a construct
|
|
predictive validity
|
a subtype of criterion validity. relies on the occurrence of a future behaviour that is logically consistent to verify the indicator of a construct
|
|
most qualitative researchers are more concerned with ____ than validity
|
authenticity
|
|
authenticity
|
giving a fair, honest, and balanced account of social life from the viewpoint of someone who lives it everyday
|
|
how do qualitative researchers feel about reliability
|
they accept the principle but rarely use the term due to its alignment with quan. they recognize that since they use multiple measures, it is often impossible to repeat a study
|
|
Lincoln and Guba's 3 notions of trustworthiness
|
1) consistency
2) truth value
3) applicability |
|
Lincoln and Guba think quantitative should use the terms ....
|
1) consistency --> reliability
2) truth value--> Internal validity 3) Applicability--> External validity |
|
Lincoln and Guba think qualitative should use the terms.....
|
1) consistency --> dependability
2) truth value --> credibility
3) applicability --> transferability |
|
is reliability necessary for validity?
|
yes
|
|
internal validity
|
can alternative explanations for change in the dependent be eliminated?
|
|
external validity
|
ability to generalize findings from a specific setting and small group to a broad range of settings and people. mainly used in experimental research
|
|
statistical validity
|
was the correct test used?
|
|
discrete levels of measurement are ___ and ____
|
nominal, ordinal
|
|
___ level data is rarely used in social sciences
|
ratio
|
|
scales and indexes produce
|
ordinal and interval data
|
|
mutually exclusive attributes
|
an individual or case fits into one, and only one attribute of a variable
|
|
exhaustive attributes
|
all cases fit into one of the attributes of a variable
|
|
scales and indexes should have ______
|
unidimensionality
|
|
unidimensionality
|
all the indicators should consistently fit together and indicate a single construct
|
|
Cronbach's alpha
|
used to assess unidimensionality. a good measure should be 0.70 or higher
|
|
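Cronbach's alpha can be computed by hand from the item variances and the variance of the summed scores: alpha = k/(k-1) × (1 - sum of item variances / variance of totals). A sketch with hypothetical Likert-type data:

```python
from statistics import pvariance

# hypothetical responses: 5 respondents x 4 items on a 1-5 scale
items = [
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
]  # rows = respondents, columns = items

k = len(items[0])
item_vars = [pvariance([row[j] for row in items]) for j in range(k)]
totals = [sum(row) for row in items]

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)
alpha = k / (k - 1) * (1 - sum(item_vars) / pvariance(totals))
print(round(alpha, 2))  # 0.94, well above the 0.70 threshold
```

A high alpha here means the four items move together, suggesting they indicate a single construct.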
index
|
the summing or combining of many separate measures of a construct or variable (e.g. an exam: each question is graded separately, but you are given an overall grade)
|
|
unless otherwise stated, assume the index is _____
|
unweighted
|
|
standardization
|
also called norming. selecting a base and dividing a raw measure by the base
|
|
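A sketch of standardization: dividing raw counts by a base makes otherwise incomparable measures comparable. The city names and figures below are hypothetical:

```python
# standardization (norming): divide a raw measure by a selected base
# hypothetical (crime count, population) pairs for two cities
cities = {"A": (500, 100_000), "B": (600, 300_000)}

rates = {}
for name, (crimes, pop) in cities.items():
    rates[name] = crimes / pop * 10_000  # crimes per 10,000 residents
    print(name, rates[name])
```

City B has more raw crimes, but after standardizing by population, city A has the higher rate.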
scale
|
captures the intensity, direction, level, or potency of a variable construct along a continuum. most are at the ordinal level.
|
|
graphic rating scale
|
people indicate a rating by checking a point on a line that runs from one extreme to another
|
|
likert scales
|
provide an ordinal-level measure of a person's attitude. they usually ask people to indicate whether they agree or disagree with a statement. they need a minimum of two categories.
|
|
response set bias
|
the tendency of some people to answer a large number of items in the same way out of laziness or psychological predisposition
|
|
semantic differential
|
a scale in which people are presented with a topic and a list of many polar-opposite adjectives. they indicate their feelings by marking one of several spaces between the two adjectives. adjectives have three major classes of meaning: evaluation, potency, and activity. evaluation is most significant
|
|
Guttman scaling
|
a scale that researchers use after data collection to reveal whether a hierarchical pattern exists among responses
|
|
scalogram analysis
|
an application of Guttman scaling; the way of testing the hierarchy. responses either fit the scale or are errors. the strength of the hierarchy is often reported, with 100 being strongest
|
|
linear path
|
follows a fixed sequence of steps. (quantitative)
|
|
nonlinear path
|
makes successive passes through steps, sometimes moving backward and sideways before moving on. more of a spiral, slowly moving upward but not directly. with each new cycle, new data are gained. (qualitative)
|
|
Universe
|
the broad class of all units that are covered in a hypothesis. all the units to which the findings may be generalized
|
|
grounded theory is associated with ____
|
qualitative
|
|
grounded theory
|
theory is developed through the data collection process; therefore it is grounded in the data. it is not used by ALL qualitative researchers
|
|
qualitative is more likely to examine ____ not variables
|
cases
|
|
the process of qualitative interpretation moves through three stages:
|
1) first order: contains the inner motives, personal reasons, and points of view of those being studied
2) second order: an acknowledgement that, no matter what, the researcher is still on the outside looking in
3) third order: links the understanding to larger concepts and theories. can teach people who are more distant from the original source |
|
what is the central idea in quantitative research
|
the variable
|
|
attributes
|
categories of a variable
|
|
variables are classified into three basic types:
|
1) the independent
2) the dependent 3) the intervening |
|
5 characteristics of a causal hypothesis
|
1) has at least 2 variables
2) expresses a causal relationship
3) can be expressed as a prediction or an expected future outcome
4) is logically linked to the research question and theory
5) is falsifiable |
|
falsifiable
|
capable of being tested against empirical evidence and shown to be true or false
|
|
the logic of disconfirming hypotheses
|
negative evidence is critical when evaluating a hypothesis and is given more importance (a hypothesis is never proved, but can be disproved)
|
|
the _____ hypothesis is correct until reasonable doubt suggests otherwise
|
null
|
|
level of analysis
|
the level of social reality to which theoretical explanations refer. it varies on a continuum from micro to macro
|
|
unit of analysis
|
the type of unit a researcher uses when measuring . (i.e. individual, group, category, etc.) individual is the most common unit
|
|
ecological fallacy
|
ERROR IN CAUSAL EXPLANATION. something appears to be a causal explanation but isn't. occurs when researchers gather data at a higher unit of analysis but want to make statements about a lower unit
|
|
reductionism (the fallacy of non-equivalence)
|
ERROR IN CAUSAL EXPLANATION. occurs when a researcher has small scale unit evidence (individual), but makes overgeneralizations to large scale units (society)
|
|
research projects are presented in one of 5 forms
|
1) periodicals
2) books
3) dissertations
4) government documents
5) policy reports |
|
periodicals
|
- selected, condensed summaries prepared by journalists for a general audience; they lack many essential details needed for a serious evaluation of a study
|
|
scholarly journals
|
the primary type of periodical because it is filled with peer-reviewed reports, although one rarely finds them outside a university library
|
|
why are qualitative journals more difficult to identify?
|
because students often confuse them with theoretical essays
|
|
dissertations
|
all grad students who receive a PhD are required to complete a work of original research, which they write up as a dissertation.
- it stays in the library of the university that granted the PhD |
|
How to conduct a systematic Lit review- 6 steps
|
1) Define and refine a topic
2) Design a search
3) Locate research reports
4) Take notes. two types of notes are needed: a source file and a content file. source files come in two types: have files and potential files
5) Organize notes
6) Write the review |
|
ethical issues ask you to balance two values
|
1) the pursuit of knowledge
2) the rights of research participants |
|
most unethical behaviour is due to 2 things
|
1) lack of awareness
2) pressures to take shortcuts |
|
scientific misconduct
|
occurs when someone engages in research fraud, plagiarism, or other unethical conduct that deviates from accepted scientific research
|
|
research fraud
|
occurs when a researcher fakes or invents data that they didn't really collect, or fails to honestly and fully report how a study was conducted
|
|
plagiarism
|
occurs when a researcher steals the ideas or writings of another or uses them without citing a source
|
|
the law of codes of ethics (3)
|
never cause unnecessary or irreversible harm to subjects; secure prior voluntary consent where possible; and never unnecessarily humiliate, degrade, or release harmful info about specific individuals that was collected for research purposes
|
|
only __ to ___ % of studies involved any person who suffered any harm
|
3-5
|
|
- In Canada, the tri-council policy statement on the ethical conduct of research involving humans specifies that...
|
the researchers have a duty to maximize the benefits that their research has on others
|
|
informed consent
|
an agreement by participants stating that they are willing to be in a study and they know something about what the research procedure will involve
|
|
studies suggest that people who fill out a full informed consent form _______ than those who do not.
|
do not respond differently
|
|
those who did not sign a full informed consent are more likely to ___ or answer _____
|
guess, "no response"
|
|
special populations
|
people who lack the necessary cognitive competency to give valid informed consent, or people in a weak position who might cast aside their freedom to refuse in order to participate in a study
|
|
it is unethical to involve incompetent people in research unless the researcher meets two minimal conditions:
|
1) a legal guardian grants written permission and 2) the research follows all standard ethical procedures to protect from harm
|
|
limited coercion is acceptable only as long as it meets three conditions
|
1) it is attached to a clear educational objective
2) the students have a choice to do a different activity
3) all other ethical principles are followed
|
|
in a study like the Tuskegee syphilis study, a researcher can reduce the chances of creating that inequality of access to survival in three ways:
|
1) subjects who don't receive the "new improved" treatment continue to receive the best previous treatment
2) researchers can use a crossover design (the group that gets no treatment in the first phase becomes the group with the treatment in the second phase, etc.)
3) the researcher continuously monitors results
|
|
confidentiality
|
info with participants' names attached, but the researcher holds it in secret from the public
|
|
anonymity
|
people remain nameless, even from the researcher
|
|
anonymity without confidentiality
|
occurs when detailed info is made public, but the name is withheld
|
|
confidentiality without anonymity
|
occurs when info is not made public, but a researcher privately links individual names to specific responses
|
|
the statistics act of 1917
|
an act passed to give the government the right to collect all types of data and publish it. it also created the legal obligation to protect confidentiality
|
|
Russell Ogden
|
became the first and only Canadian researcher to receive a subpoena asking him to reveal the names of his research participants. he refused, and a judge finally let him go
|
|
captive populations
|
students, prisoners, employees, patients, soldiers, etc. a special problem with anonymity and confidentiality arises because gatekeepers may restrict access unless they see the info
|
|
cardinal principle of research
|
respect for human dignity
|
|
the Interagency advisory panel on research and ethics (PRE)
|
holds the guidelines to protect participants.
|
|
when you are doing research for a sponsor and they confront you wil an illegitimate demand, you have three choices:
|
1) loyalty to the organization (caving)
2) exiting the situation (quitting)
3) voicing opposition (whistle-blowing) |
|
whistle blowing
|
the researcher who sees an ethical wrongdoing but cannot stop it, even after informing superiors and exhausting avenues of resolution. he or she then turns to outsiders, and informs an external audience, agency, or media
|
|
hired hand
|
always gives in to sponsors and does what they want, even if it's ethically wrong
|
|
_____ puts a high value on value-free and objective research
|
positivism
|
|
interpretive approaches see things from a _____ stance
|
relative. no single value position is better than any other
|
|
critical approaches see value-free research as....
|
questionable and a sham.
|
|
therefore, people with interpretive or critical views think a researcher should make his or her own value position _____
|
explicit
|
|
when researchers are debating about the idea of "value-free" they believe it only pertains to ____
|
conducting the study, because they ALL agree values seep through when deciding the TOPIC
|
|
5 alternative ways of "knowing"
|
1) authority
2) tradition
3) common sense
4) media myths
5) experience |
|
gambler's fallacy
|
"if i have a long string of losses playing lottery, the next time I play, the chances of me winning will be greater" (part of the common sense limitation)
|
|
people are misled by ______ more easily than other types of "lying"
|
Visual images
|
|
four errors associated with personal experience as a way of "knowing"
|
1) overgeneralization
2) selective observation
3) premature closure
4) halo effect |
|
premature closure:
|
occurs when you feel you have the answers and do not need to listen, seek info, or question anymore
|
|
halo effect:
|
when we overgeneralize from what we accept as being highly positive/prestigious and let its strong reputation/prestige rub off onto other areas (i.e. judging a good looking person as intelligent)
|
|
empirical evidence
|
observations that people experience through senses
|
|
scientific inquiry
|
statistical analysis of data
|
|
scientific community
|
a collection of people who practice science and a set of norms, behaviours, and attitudes that bind them together. it is a professional community. for the most part it includes both natural and social sciences
|
|
a discipline such as sociology may have about ____ active researchers worldwide
|
8,000 (with only about 100 very active researchers conducting the majority of the work)
|
|
7 steps in the research process
|
1) select a topic
2) narrow the topic through reviewing literature and making a hypothesis
3) design stage: decide details
4) collect data
5) analyze data
6) interpret data
7) inform others |
|
four dimensions of research
|
1) use (basic vs. applied)
2) purpose
3) time dimensions
4) data collection techniques |
|
basic research
|
research designed to advance fundamental knowledge about the social world. this is the source of most new ideas. often lacks a short term practical application. the scientific community is the primary consumer.
|
|
applied research
|
research attempting to solve a concrete problem or address a specific policy question. has a direct, practical application. less likely to enter public domain, and consumers are people such as teachers, and decision makers. there are three types of applied research.
|
|
the three types of applied research
|
1) evaluation research study
2) action research study 3) social impact assessment research study |
|
evaluation research study
|
type of applied research. one tries to determine how well a program/policy is working or reaching its goals. the most common type of applied research; experimental designs are preferred
|
|
action research study
|
type of applied research. a researcher treats knowledge as a form of power and abolishes the division between creating knowledge and using it to engage in political action. tends to be associated with social movements; not value-neutral. there are many types, but they share 5 characteristics:
1) the people being studied participate in the research
2) the research incorporates popular knowledge
3) the researcher focuses on issues of power
4) the researcher seeks to raise conscious awareness of issues
5) the research is tied directly to a plan of political action
|
|
social impact assessment research study
|
part of applied research. the researcher estimates the likely consequences/outcomes of a planned intervention or intentional change that is to occur in the future.
|
|
3 types of study purposes
|
1) exploration
2) description 3) Explanation |
|
exploration
|
research into an area that has not been studied, in which a researcher wants to develop initial ideas and a more focused research question. tends to use qualitative data, and rarely yields concrete answers
|
|
description
|
research in which one "paints a picture" with words or numbers, presents a profile, outlines stages, or classifies types. uses most data gathering techniques but use of experiments is limited
|
|
Explanation
|
research that focuses on why events occur or tries to test and build social theory
|
|
3 types of time dimensions in research
|
1) cross- sectional research
2) Longitudinal research 3) case studies |
|
cross sectional research
|
most social research is this. research in which a researcher examines one point in time. most consistent with descriptive research but can be used with all three
|
|
longitudinal research
|
research in which the researcher examines features of people or other units at multiple points in time. more powerful than cross sectional but costs more. used by descriptive and explanatory researchers. there are three types: time series, panel, and cohort
|
|
Time series study
|
researcher gathers the same type of info across two or more time periods
|
|
panel study
|
a researcher observes exactly the same people, group, or org. across multiple time points.
|
|
cohort study
|
a researcher focuses on a category of people who share a similar life experience in a specified time period.
|
|
case studies
|
research on one or a small number of cases in which a researcher carefully examines a large number of details about each case
|
|
4 data collection techniques: Qualitative techniques
|
1) interviews
2) focus groups 3) field research 4) historical-comparative |
|
4 data collection techniques: quantitative techniques:
|
1) experiments
2) surveys 3) content analysis 4) existing stats |
|
experiments
|
researchers create situations and examine their effects on participants. most effective for EXPLANATORY research
|
|
surveys
|
used in DESCRIPTIVE or EXPLANATORY research
|
|
content analysis
|
research in which one examines patterns of symbolic meaning within written text, audio, visual, or other communication media. can be used for all three types of research but is most likely used for DESCRIPTIVE
|
|
existing statistics
|
combines past information in new ways to address a research question. can be used for all types of research but is most likely used for DESCRIPTIVE
|
|
interviews
|
Often used for EXPLORATORY and DESCRIPTIVE studies
|
|
focus groups
|
like interviews, but conducted with a group of 5-7 people. used for EXPLORATORY and DESCRIPTIVE studies
|
|
field research
|
the researcher directly observes the people being studied in a natural setting for an extended period. used for DESCRIPTIVE and EXPLORATORY research
|
|
historical-comparative
|
examines aspects of social life in a past historical era or across different cultures. Can be used for all three types of research and types can be blended
|
|
in what type of research is theory most evident?
|
Basic/explanatory
|
|
three things that social science theories do:
|
1) explain recurring events
2) explain aggregates 3) state a probability for events to occur |
|
aggregates
|
collections of many individuals, cases, or other units
|
|
concept and its two parts
|
an idea expressed as a symbol or in words (e.g., height). concepts have two parts: a symbol and a definition
|
|
classification concepts
|
some concepts are multidimensional, with subtypes; these are partway between a single concept and a theory
|
|
scope
|
concepts vary by scope: some are highly abstract and some are concrete. more abstract concepts have a WIDER scope
|
|
proposition
|
a relationship in a theory in which the scientific community starts to gain greater confidence and feels it is likely to be true
|
|
we can categorize a theory by 4 things:
|
1) the direction of its reasoning 2) the level of social reality it explains 3) the form of explanation it employs 4) the overall framework of assumptions in which it is embedded
|
|
the direction of the theory (inductive vs deductive)
|
1) inductive: begin with detailed observations of the world and move toward more abstract generalizations; often used with grounded theory
2) deductive: begin with abstract, logical relationships among concepts, and then move toward concrete evidence |
|
range of theory (from empirical, to middle range, to abstract theory)
|
1) empirical generalization = narrow range; a pattern among concrete concepts 2) middle-range theory = in between; focuses on a substantive topic area 3) theoretical frameworks = widest range; abstract systems of assumptions
|
|
there are 3 levels of theory
|
1) micro
2) meso 3) macro |
|
micro level theory
|
deals with small slices of time, space, and numbers of people. most of us devote our time to this level
|
|
meso level theory
|
links micro and macro. theories of organizations, social movements, and communities are often here
|
|
macro level theory
|
concerns the operation of larger aggregates; more abstract
|
|
6 forms of theory explanation
|
1) theoretical explanation
2) ordinary explanation 3) prediction 4) Causal explanation 5) structural explanation 6) interpretive explanation |
|
theoretical explanation
|
a logical argument that tells why something occurs and how concepts are connected
|
|
ordinary explanation
|
makes something clear or describes something in a way that illustrates it
|
|
causal explanation
|
the most common type of explanation. used when the relationship is one of cause and effect. three things are needed to establish causality
|
|
three things needed to establish causality
|
1) temporal order (cause must come before effect)
2) association (the two things appear together; association alone does not prove causation) 3) eliminating alternatives (no spuriousness) |
|
structural explanation
|
a set of interconnected assumptions and relationships. the researcher uses metaphors or analogies so that the relationships "make sense". often associated with functional theory: "X occurs because it serves needs in the system Y"
|
|
interpretive explanation
|
purpose is to foster understanding. attempt to discover the meaning of an event/practice by placing it within a specific social context
|
|
the paradigm
|
Thomas Kuhn's term for an integrated system of assumptions, beliefs, models, and techniques. it organizes core theories and methods. there are three main paradigms in social science
|
|
the positivist paradigm
|
- most widely practiced
- value-free - nomothetic (law-like principles) - favours experiments |
|
interpretive paradigm
|
- believes in science, but holds that human social life must be studied differently
- constructionist view of social reality - inductive, idiographic - verstehen |
|
constructionist view
|
human social life is based less on objective, hard, factual reality than on the beliefs and practices that people hold about reality
|
|
idiographic
|
"specific description": refers to explaining an aspect of the social world by offering a highly detailed picture or description of a specific social setting, process, or relationship
|
|
critical paradigm
|
- blends materialist and constructionist views
- not value-free - puts knowledge into action - praxis |
|
praxis
|
the blending of theory and concrete action
|
|
3 main goals of social research
|
1) explore new angles on previously unexamined phenomena
2) try to describe information in detail 3) try to explain behaviour |
|
presentist bias
|
the mistaken idea that what we are experiencing has only ever happened to us (i.e., that it is historically unique)
|
|
assumptions
|
every theory comes with assumptions, which should be stated at the beginning of the research
|
|
positivism (3 points)
|
1) sociology is a science
2) human behaviour can be objectively measured/analyzed 3) social facts are things, because they determine human behaviour/attitudes |
|
critical paradigm (4 points)
|
1) sociological research is limited by those who develop it/those who receive/interpret it
2) research should be comparative/historical in scope 3) it is impossible for research to be objective 4) understanding inequality is the foremost purpose of research |
|
interpretive approach (3 points)
|
1) sociology is a science, whose purpose is to understand the social meanings of human social action/interaction
2) reflexivity is essential to understanding how we construct meanings 3) intimate descriptions and details are the ultimate goal |
|
the human subjects approval must contain the 5 following things:
|
1) letter of permission from ethics board
2) permission to use the instruments of others 3) a methodology that maintains human rights 4) an informed consent statement 5) a debriefing statement |
|
levels of interpretation
|
1) first order: comes directly from subjects (e.g., quotes); selectively used; effective
2) second order: the researcher's interpretation of subjects' perspectives; loses some of the original meaning 3) third order: the academic perspective; correlating information on the same topic |
|
falsification
|
Karl Popper's "logic of disconfirming the hypothesis": we can never really prove anything, only fail to disprove it ("innocent until proven guilty")
|
|
experimenter bias
|
noticing only the things that support your prediction
|
|
hawthorne effect
|
you know you're being observed, so you change the way you're acting, anticipating what is wanted
|
|
contamination
|
occurs when your different test groups come into contact with one another
|
|
the single most important concept in inferential stats:
|
that your sample is representative of your population
|
|
mixed mode research
|
doing research on the majority first, then doing targeted research to reach the missing groups
|
|
a questionnaire should include 4 things:
|
1) an introduction
2) a summary of the study 3) instructions 4) a method of return |