
A national place-based formative evaluation of the Indigenous Chronic Disease Package –

reflections on an innovative evaluation approach

Australasian Evaluation Society International, Brisbane

5th September, 2013

Jodie Griffin, Gill Schierhout, Tracy McNeair, Nikki Percival, Lynette O’Donoghue, Margaret Kelaher, Amal Chakraborty, Barbara Beacham, Alison Laycock, Ross Bailie

Acknowledgements

• Health service and Medicare Local staff
• Community focus group participants
• Key informants in regional support organisations
• Department of Health and Ageing

Sentinel Sites Evaluation team: Ross Bailie, Jodie Griffin, Gill Schierhout, Tracy McNeair, Alison Laycock, Lynette O’Donoghue, Margaret Kelaher, Nikki Percival, Amal Chakraborty, Barbara Beacham, Jennifer Allchurch, Marcus Goddard, Marianne Hellers, Trish Hickey, Julia Hodgson, Michael Howard, Elaine Kite, Katherine Moore, Andrea Moser, Barry Schrimshaw, Kevin Swift and Zewdu Woubalem-Wereta

Presentation outline

• Context
• Aims and objectives
• Evaluation methodology
• Reflections
• Discussion

[Diagram: ICDP – $805.5 million investment, 2009–2013. Priority areas, associated measures and funds allocated per measure ($ million):
• Tackle chronic disease risk factors – National Action to Reduce Smoking Rates; Local Community Campaigns to Reduce Risk of Chronic Disease
• Improve chronic disease management and care – chronic disease self-management; coordination of chronic disease management (PIP Indigenous Health Incentive and Care Coordination and Supplementary Services); higher utilisation costs (MBS and PBS); subsidising PBS co-payments; specialist and multidisciplinary team care
• Workforce expansion and support – workforce support, education and training; expanding outreach and service capacity of Aboriginal Health Services; engaging Divisions of General Practice to improve access; more people to work in Aboriginal health
Funding streams: Pharmaceutical Benefits; Medical Benefits; Primary & Ambulatory Care; Health Workforce; OATSIH. Administering areas shown: DoHA, OATSIH, Population Health, People, Capability & Communications, Decision Support (Guidelines).]

Monitoring and evaluation

• Monitoring & Evaluation Framework (2009–2010)
• Sentinel Sites Evaluation (2010–2013)
• National Evaluation (2012–2013)
• Measure-specific evaluations

Objectives – Sentinel Sites Evaluation

• Monitoring implementation of the ICDP at the local level

• Identifying changes resulting from the ICDP, including early outcomes

• Providing timely feedback on barriers and enablers affecting implementation

• Contributing to the national evaluation

Location of Sentinel Sites

• 24 sites established between 2010 and 2011

• Phased approach

• Across all States/Territories

• Urban, regional and remote locations

• Three site types established (tracking, enhanced tracking, case study)

Site establishment

• Purposively selected

• Geographic boundary

• Key stakeholders – Aboriginal Health Services (AHS) and Divisions of General Practice (DGP)

• Introductory and tailoring visits

Data collection

[Table: data collected by site type – Tracking (8 sites), Enhanced tracking (8 sites), Case study (8 sites) – drawing on program data, administrative data, key informant interviews, clinical indicators and community focus groups.]

Cyclical nature of evaluation

[Timeline diagram: five six-monthly evaluation cycles, each combining evaluation methods with engagement with DoHA and engagement with sites. Contract signed March 2010; 1st evaluation cycle Aug–Oct 2010; 5th evaluation cycle Aug–Oct 2012; final reports to DoHA and to sites.]

Data collection

Administrative data from DoHA
• Medicare, PIP and PBS data
• Program data

Interviews
• Over 700 in-depth, face-to-face interviews
• General practices, Aboriginal Health Services and support organisations (e.g. Medicare Locals)

Community focus groups
• 76 groups; 670 participants (average of 9 per group)

Clinical indicators
• 41 health services

Reflection points

1. Multiple local sites

2. Cyclical nature of the evaluation

3. Analysis methodology

Multiple local sites (1)

• Multiple sites across a diversity of settings

Multiple local sites (2)

• Provided new insights across a variety of settings

‘Approach is quite different … because the unit of analysis for the whole evaluation was a series of sites which could be compared and contrasted along various dimensions of system capacity and development and were also tracked over time. It enabled a level of analysis of local context that provided rich explanation of the observed differences across sites.’ (ANPHA, 2013)

Multiple local sites (3)

• Enabled site engagement

‘It made me feel important and what I was doing important. I knew it was being taken seriously and fed up the line.’ (Case study site)

‘We have lost the ability to tell the story that supplements the data … and this allows us to with this style of evaluation … it was more useful for us.’ (Case study site)

Cyclical nature of evaluation (1)

[Timeline diagram repeated from the earlier ‘Cyclical nature of evaluation’ slide: five six-monthly evaluation cycles, March 2010 – Oct 2012.]

Cyclical nature of evaluation (2)

• Trust developed and provided depth to interviews

‘Site stakeholders valued the opportunity to be heard and have input into the evaluation. The one-on-one interview format was very much appreciated … It provided the freedom for people to give their open and honest responses that they may have not otherwise given.’ (Program Manager, SSE)

Cyclical nature of evaluation (3)

• Sites valued access to timely local-level data with comparators

‘It has allowed us to look at what we are doing in comparison to others …’ (Case study site)

‘It has provided a tool on how I could lift my game to try and do better or if something was working well reflect on why it worked well.’ (Case study site)

Cyclical nature of evaluation (4)

• Resource intensive
– Demanding for the evaluation team
– Domino effect if delays occurred

Analysis methodology (1)

• Realist evaluation combined with systems thinking

‘… “realist” evaluation thinking was used to answer the questions “what works, for whom, and under what circumstance?” … More than this, the evaluation helped to foster a “systems” thinking approach to how the package of measures might be better supported and implemented to achieve its outcomes.’ (ANPHA, 2013)

Reference: Australian National Preventive Health Agency (ANPHA) 2013. State of Preventive Health 2013: Report to the Australian Government Minister for Health. Canberra.

Analysis methodology (2)

• Measure manager workshops

‘Workshops have been good in helping the thinking of priorities, and the reports are a good reference and allow discussion. It’s hard to get the nuances in a written document. Having two-hour sessions assists with focus once you get the report.’ (Measure manager, DoHA)

Analysis methodology (3)

• SSE analysis workshops

‘What worked were the analysis workshops where we discussed emerging patterns and themes and involved the whole team. The diversity of the team’s views was valuable.’ (SSE team member)

‘The Sentinel Sites method is an innovative approach to the evaluation of a national program.’ (DoHA, 2013)

‘… approach is quite different to the use of vignettes or case studies of local practice that are commonly found in national evaluations.’ (ANPHA, 2013)

References:
Australian National Preventive Health Agency (ANPHA) 2013. State of Preventive Health 2013: Report to the Australian Government Minister for Health. Canberra.
Department of Health and Ageing 2013. Response to the Sentinel Sites Evaluation Interim Report: December 2011.

Next steps

• Publication of evaluation reports
• Further dissemination
• Discussion paper on key learnings

Contact details & reference

Contact: Jodie Griffin [email protected]

Reference: Bailie R, Griffin J, Kelaher M, McNeair T, Percival N, Laycock A, Schierhout G 2013. Sentinel Sites Evaluation: Final Report. Report prepared by Menzies School of Health Research for the Australian Government Department of Health and Ageing, Canberra.