
In-person
session 5

September 19, 2022

PMAP 8521: Program evaluation
Andrew Young School of Policy Studies


Plan for today

DAGs, continued

Potential outcomes vs. do() notation

do-calculus, adjustment, and CATEs

Logic models, DAGs, and measurement


DAGs, continued


Effect of race on police use of force
using administrative data


Use of force


Smoking → Cardiac arrest example

Person   Smoker   Cardiac arrest   Cholesterol   Weight   Lifestyle healthiness
1        TRUE     TRUE             150           170      6
2        TRUE     FALSE            170           180      3
3        FALSE    FALSE            130           110      9
4        FALSE    TRUE             140           140      8
5        TRUE     TRUE             120           150      2
6        TRUE     FALSE            130           230      3
7        FALSE    FALSE            140           250      10
dag {
bb="0,0,1,1"
"Cardiac arrest" [outcome,pos="0.599,0.432"]
Cholesterol [pos="0.415,0.440"]
Lifestyle [pos="0.156,0.317"]
Smoking [exposure,pos="0.243,0.428"]
Weight [adjusted,pos="0.297,0.255"]
Cholesterol -> "Cardiac arrest"
Lifestyle -> Smoking
Lifestyle -> Weight
Smoking -> Cholesterol
Weight -> Cholesterol
}
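The DAG above implies a backdoor path Smoking ← Lifestyle → Weight → Cholesterol → Cardiac arrest, and the `adjusted` flag on Weight marks one way to close it. A minimal simulation sketch (a linear toy version of this DAG; all coefficients and the `ols_slope` helper are made up for illustration) shows why the naive estimate is biased while the weight-adjusted one is not:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical linear version of the DAG (coefficients are invented):
# Lifestyle -> Smoking, Lifestyle -> Weight,
# Smoking -> Cholesterol, Weight -> Cholesterol, Cholesterol -> Cardiac arrest
lifestyle = rng.normal(size=n)
smoking = -0.5 * lifestyle + rng.normal(size=n)
weight = -0.7 * lifestyle + rng.normal(size=n)
cholesterol = 0.8 * smoking + 0.6 * weight + rng.normal(size=n)
arrest_risk = 1.0 * cholesterol + rng.normal(size=n)
# True causal effect of smoking on arrest risk: 0.8 * 1.0 = 0.8

def ols_slope(y, *xs):
    """Coefficient on the first regressor in an OLS fit."""
    X = np.column_stack((np.ones(len(y)),) + xs)
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

naive = ols_slope(arrest_risk, smoking)             # backdoor left open
adjusted = ols_slope(arrest_risk, smoking, weight)  # weight closes the backdoor
print(round(naive, 2), round(adjusted, 2))
```

The naive slope picks up the Lifestyle → Weight path on top of the true 0.8; adjusting for weight recovers it.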

How can you be sure
you include everything in a DAG?

How do you know when to stop?

Is there a rule of thumb
for the number of nodes?


Why can we combine nodes in a DAG if they
don't represent the same concept?

Why include unmeasurable things in a DAG?


Why do DAGs have to be acyclic?

What if there really is reverse causation?


How do we actually
adjust for these things?


Potential outcomes
vs. do() notation


Expectations


$E(\cdot)$, $\mathbb{E}(\cdot)$, $\mathbf{E}(\cdot)$ vs. $P(\cdot)$

Basically a fancy way of saying "average"


Outcomes and programs

Outcomes and program effect

Causal effects with potential outcomes

Potential outcomes notation:

$$\delta = \frac{1}{n} \sum_{i=1}^{n} \big( Y_i(1) - Y_i(0) \big)$$

or alternatively with $E$:

$$\delta = E[Y_i(1) - Y_i(0)]$$
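With a (hypothetical, normally unobservable) table of both potential outcomes, $\delta$ is just the average of the unit-level differences:

```python
# Hypothetical potential outcomes for four units -- in real data we never
# observe both columns for the same unit
y1 = [5, 3, 8, 6]  # Y_i(1): outcome if unit i is treated
y0 = [4, 3, 5, 2]  # Y_i(0): outcome if unit i is untreated
delta = sum(t - c for t, c in zip(y1, y0)) / len(y1)
print(delta)  # 2.0
```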


Causal effects with do()

Pearl notation:

$$\delta = E[Y_i \mid do(X = 1)] - E[Y_i \mid do(X = 0)]$$

or more simply:

$$\delta = E[Y_i \mid do(X)]$$



$$E[Y_i \mid do(X)] = E[Y_i(1) - Y_i(0)]$$


We can't see this:

$$E[Y_i \mid do(X)] \quad \text{or} \quad E[Y_i(1) - Y_i(0)]$$

So we find the average causal effect (ACE):

$$\hat{\delta} = E[Y_i \mid X = 1] - E[Y_i \mid X = 0]$$
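A tiny made-up example of why the naive difference in group means can miss the true $\delta$ when treatment isn't random:

```python
# Hypothetical potential outcomes; treatment is NOT random -- here the
# units with the smallest treatment effects happen to get treated
y1 = [5, 3, 8, 6]
y0 = [4, 3, 5, 2]
treated = [True, True, False, False]

true_delta = sum(t - c for t, c in zip(y1, y0)) / len(y1)  # 2.0

# We only observe one potential outcome per unit:
observed = [t if d else c for t, c, d in zip(y1, y0, treated)]
mean_treated = sum(o for o, d in zip(observed, treated) if d) / sum(treated)
mean_control = sum(o for o, d in zip(observed, treated) if not d) / (
    len(treated) - sum(treated)
)
naive = mean_treated - mean_control
print(true_delta, naive)  # 2.0 0.5
```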

Correlation is not causation

do-calculus,
adjustment, and CATEs


DAGs and identification

DAGs are a statistical tool, but they don't
tell you what statistical method to use

DAGs help you with the identification strategy

Thomas Massie tweet

Easiest identification

Identification through research design

RCTs

When treatment is randomized, delete all arrows going into it

No need for any do-calculus!


Most other identification

Identification through do-calculus

Rules for graph surgery

Backdoor adjustment and frontdoor adjustment
are special common patterns of do-calculus


Where can we learn more about do-calculus?

Here!

Do-calculus

Rule 1: Decide if we can ignore an observation

$$P(y \mid z, do(x), w) = P(y \mid do(x), w) \quad \text{if } (Y \perp\!\!\!\perp Z \mid W, X)_{G_{\overline{X}}}$$

Rule 2: Decide if we can treat an intervention as an observation

$$P(y \mid do(z), do(x), w) = P(y \mid z, do(x), w) \quad \text{if } (Y \perp\!\!\!\perp Z \mid W, X)_{G_{\overline{X}, \underline{Z}}}$$

Rule 3: Decide if we can ignore an intervention

$$P(y \mid do(z), do(x), w) = P(y \mid do(x), w) \quad \text{if } (Y \perp\!\!\!\perp Z \mid W, X)_{G_{\overline{X}, \overline{Z(W)}}}$$

Backdoor adjustment derivation
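The derivation in the figure follows three standard steps (a sketch, where $z$ is a set of covariates that blocks every backdoor path from $x$ to $y$):

$$
\begin{aligned}
P(y \mid do(x)) &= \sum_z P(y \mid do(x), z)\, P(z \mid do(x)) && \text{(marginalize over } z\text{)} \\
&= \sum_z P(y \mid x, z)\, P(z \mid do(x)) && \text{(Rule 2: given } z\text{, observing } x \text{ = intervening on } x\text{)} \\
&= \sum_z P(y \mid x, z)\, P(z) && \text{(Rule 3: } do(x) \text{ doesn't affect } z\text{)}
\end{aligned}
$$

The last line is the backdoor adjustment formula.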

Adjusting for backdoor confounding

Backdoor adjustment
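A sketch of the backdoor adjustment formula $P(y \mid do(x)) = \sum_z P(y \mid x, z)\, P(z)$ in action, with made-up probabilities (the data-generating process here is invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Made-up data-generating process: Z confounds X -> Y
z = rng.random(n) < 0.5                         # binary confounder
x = rng.random(n) < np.where(z, 0.8, 0.2)       # Z makes treatment more likely
y = rng.random(n) < (0.2 + 0.3 * x + 0.3 * z)   # true effect of X on P(y): +0.3

# Naive difference in means (backdoor left open)
naive = y[x].mean() - y[~x].mean()

# Backdoor adjustment: sum over z of [P(y | x=1, z) - P(y | x=0, z)] * P(z)
adjusted = sum(
    (y[x & (z == v)].mean() - y[~x & (z == v)].mean()) * (z == v).mean()
    for v in (True, False)
)
print(round(naive, 2), round(adjusted, 2))  # naive is inflated; adjusted is close to 0.3
```

Stratifying on the confounder and reweighting by $P(z)$ recovers the +0.3 effect that the raw comparison overstates.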

Adjusting for frontdoor confounding


Smoking/tar + Uber

Effect of shared rides on tips; use frontdoor magic

Like IV but in reverse:

  • IV: instrument → treatment → outcome
  • Frontdoor: treatment → instrument-y mediator → outcome

dag {
bb="0,0,1,1"
"Actually take shared ride" [pos="0.528,0.508"]
"Authorize shared ride" [exposure,pos="0.288,0.504"]
"Lots of unobserved stuff" [pos="0.521,0.342"]
"Tip driver" [outcome,pos="0.743,0.518"]
"Actually take shared ride" -> "Tip driver"
"Authorize shared ride" -> "Actually take shared ride"
"Lots of unobserved stuff" -> "Authorize shared ride"
"Lots of unobserved stuff" -> "Tip driver"
}
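In a linear world, the frontdoor trick for this DAG is just the product of two regression coefficients: authorize → actually-take, and actually-take → tip (adjusting for authorize). A simulation sketch with invented coefficients and a hypothetical `ols_coefs` helper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Made-up linear version of the Uber DAG above
unobserved = rng.normal(size=n)                    # "lots of unobserved stuff"
authorize = 0.7 * unobserved + rng.normal(size=n)  # authorize shared ride
take = 0.5 * authorize + rng.normal(size=n)        # actually take shared ride
tip = 0.8 * take + 0.6 * unobserved + rng.normal(size=n)
# True effect of authorizing on tipping: 0.5 * 0.8 = 0.4

def ols_coefs(y, *xs):
    X = np.column_stack((np.ones(len(y)),) + xs)
    return np.linalg.lstsq(X, y, rcond=None)[0]

naive = ols_coefs(tip, authorize)[1]       # confounded by the unobserved stuff
a = ols_coefs(take, authorize)[1]          # authorize -> take (no backdoor)
b = ols_coefs(tip, take, authorize)[1]     # take -> tip, adjusting for authorize
frontdoor = a * b
print(round(naive, 2), round(frontdoor, 2))
```

The naive regression is inflated by the unobserved confounder; chaining the two clean pieces recovers the true 0.4.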

https://twitter.com/andrewheiss/status/1361686426820222977

More complex DAGs without
obvious backdoor or frontdoor solutions

Chug through the rules of do-calculus
to see if the relationship is identifiable

Causal Fusion

Causal Fusion example


When things are identified, there are
still arrows leading into Y.
What do we do with those?
How do you explain those relationships?

Outcomes have multiple causes.
How do you justify that your proposed
cause is the most causal factor?


100% depends on your research question

Why can't we just subtract the averages
between treated and untreated groups?


When you're making groups for CATE, how do
you decide what groups to put people in?


Slides from lecture


Unconfoundedness assumption

How can we assume/pretend that treatment was
randomly assigned within each age?

It seems unlikely. Wouldn't there be other factors within the older/younger group that make a person more/less likely to engage in treatment (e.g., health status)?
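If treatment really is as-good-as-random within each age group, the CATEs are simple within-group differences in means, and the ATE is their group-size-weighted average. A made-up-numbers sketch:

```python
# Hypothetical outcomes: older people are treated more often, so the pooled
# difference in means is biased even though treatment is random *within*
# each group (the unconfoundedness assumption)
young_treated, young_control = [6, 8], [5, 5, 5, 5]
old_treated, old_control = [9, 11, 10, 10], [4, 6]

def mean(xs):
    return sum(xs) / len(xs)

cate_young = mean(young_treated) - mean(young_control)  # 7 - 5 = 2
cate_old = mean(old_treated) - mean(old_control)        # 10 - 5 = 5
ate = (6 * cate_young + 6 * cate_old) / 12              # equal-sized groups: 3.5

# Pooled comparison ignores the groups and overstates the effect:
naive = mean(young_treated + old_treated) - mean(young_control + old_control)
print(cate_young, cate_old, ate, naive)  # 2.0 5.0 3.5 4.0
```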


Slides from lecture



Does every research question
need an identification strategy?

No!

Correlation alone is okay!
Can lead to more focused causal questions later!

Moderna EBV trials

A correlational study found that MS is strongly associated with Epstein-Barr virus (EBV). Researchers don't know the exact mechanism yet, but because of mRNA vaccine technology they can develop vaccines against EBV and help stop MS, figuring out the exact mechanisms later. For now, they've started clinical trials.

https://www.forbes.com/sites/roberthart/2022/01/14/moderna-starts-human-trials-of-mrna-vaccine-for-virus-that-likely-causes-multiple-sclerosis/?sh=74f52ca51a04

Logic models, DAGs,
and measurement


What's the difference between
logic models and DAGs?

Can't I just remake my logic model in Dagitty and be done?



DAGs vs. Logic models

DAGs are a statistical tool

Describe a data-generating process
and isolate/identify relationships

Logic models are a managerial tool

Oversee the inner workings of a program and its theory

Green space in Berkeley
Covid green spaces
