Tips for Introducing Evidence-Based Conservation

Design a project to generate evidence. Teams can accelerate the development of a Conservation Evidence Base by treating their conservation engagement as a "hypothesis" and building into it elements of good experimental design, such as: a clear understanding of the assumptions being made in the theory of change; a hypothesis of the change the intervention is proposed to make; identification of controls or counterfactuals for comparison with the project; adequate monitoring to detect change; analysis to determine effect; and an investment in communicating results, regardless of the project's "success." Sharing evidence of failure is just as important as sharing evidence of success, if not more so.

Finding evidence and building an evidence base. Sources of evidence are many and may be difficult to locate. Some may be found via literature review using standard scientific search methods, while other evidence will be found in reports, public documents, white papers, databases, oral histories, social surveys, and many other repositories. Teams should document the methods used (e.g., keywords, databases, key informants engaged, interviews conducted, social media searches) in building the evidence base for their project, and ensure that their synthesis is designed for accessibility and peer review. Because many conservation engagements address similar systems and issues, early investment in comprehensive evidence review and synthesis on major themes would benefit many projects.
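As one illustration of such documentation, the sketch below (in Python) records a search protocol as a simple structured log. The field names and example entries are hypothetical, not a prescribed schema, and should be adapted to a team's own sources and tools.

    # Hypothetical sketch of a documented evidence-search record.
    # Field names and values are illustrative, not a prescribed schema.
    search_log = [
        {
            "source": "Web of Science",           # database searched
            "date": "2024-03-15",                 # when the search was run
            "keywords": ["oyster reef", "shoreline stabilization"],
            "results_screened": 212,              # hits reviewed
            "results_retained": 18,               # items added to the evidence base
        },
        {
            "source": "key-informant interview",  # non-literature evidence
            "date": "2024-04-02",
            "informant_role": "watershed council member",
            "notes": "oral history of pre-1980 flood extents",
        },
    ]
    # A log like this makes the synthesis reproducible and open to peer review.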

Understand the context for sufficiency of evidence. The sufficiency of evidence depends on the context: what will the information be used for? Five categories of use should be considered: 1) reducing uncertainties in the theory of change and improving adaptive management; 2) avoiding and mitigating negative impacts; 3) managing legal or reputational risk; 4) reporting to funders and other philanthropic uses; and 5) influencing others. The specific circumstances within each category should also be considered. For example, who are you trying to influence? Encouraging engineering and insurance companies to alter premiums based on the presence of natural infrastructure for flood risk reduction will require rigorous evidence demonstrating a cause-and-effect relationship. In contrast, the testimony of constituents may be sufficient evidence for convincing politicians of the value of a particular conservation plan.

Provide evidence of causation through experimental design principles. To estimate the impact caused by an intervention, it is generally necessary to have data from before and after the intervention, and the same data from a comparable control group that does not receive the intervention. The required strength of evidence determines the level of experimental design and statistical rigor needed. Additional guidance on experimental design and rigor is provided in the Monitoring section and in Appendix G.
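As a minimal illustration of this before-after, control-impact logic, the sketch below computes a simple difference-in-differences estimate of intervention effect. All site values are hypothetical, and a real analysis would account for replication, variance, and formal statistical tests as described in the Monitoring section and Appendix G.

    # Minimal sketch of a difference-in-differences (BACI-style) estimate.
    # All values are hypothetical illustrations, not real monitoring data.

    # Mean outcome (e.g., native vegetation cover, %) before and after
    impact_before, impact_after = 42.0, 58.0    # sites receiving the intervention
    control_before, control_after = 41.0, 45.0  # comparable untreated sites

    # Change observed at intervention sites
    impact_change = impact_after - impact_before          # 16.0

    # Change that occurred anyway, estimated from the control sites
    background_change = control_after - control_before    # 4.0

    # The difference-in-differences estimate attributes the remainder
    # to the intervention itself.
    estimated_effect = impact_change - background_change
    print(f"Estimated intervention effect: {estimated_effect:+.1f} percentage points")
    # -> Estimated intervention effect: +12.0 percentage points

Without the control sites, the full 16-point change would be credited to the intervention, overstating its effect by the 4 points of background change.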

Capture and share knowledge. Knowledge management and transfer can be a highly leveraged conservation strategy, ensuring that the broader conservation community benefits from experience and investments regarding what works and what fails. Learning should occur in all phases of CbD 2.0. Conservation teams should be attentive to advances in knowledge that occur during their application of the process, and develop the systems and discipline to capture those advances. Documentation and dissemination of information may take a range of forms.