Request for Proposal
Evaluation of Principals on a Path to Excellence
Questions or requests for clarification must be received on or before December 14, 2016, at 5:00 p.m. Eastern Standard Time (EST), per the email instructions spelled out in the RFP. All questions received and the corresponding responses are listed below. This page will be updated each time a new question is addressed.
Q: What would NISL like done differently with the next evaluator for the i3 validation grant? In other words, what didn’t go well with the evaluation activities that have occurred so far and what is NISL hoping will change?
A: NISL is not seeking to radically redesign the evaluation, even as it is open to improvements in the design. We are seeking an evaluator that can:
- manage a complex research project in a highly competent manner
- identify strong researchers whose skills are well aligned to the tasks at hand
- set appropriate priorities to keep to a tough implementation schedule and not place any data collection activities at risk
- communicate clearly and sensitively with school participants, district leaders, and key state staff
- bring fresh and innovative ideas to evaluation design challenges
- collaborate closely with the NISL staff on key design and scheduling issues
- take the initiative in anticipating data collection and analysis issues that may be lurking around the corner
Q: What caused the attrition to be higher in the treatment group than the control group as of 11/8/2016?
A: Some attrition occurred at the district level, such as districts that reconsidered their commitment or had leadership changes occur before or immediately after treatment began. Some individual principals also opted out of treatment for personal reasons (e.g., health, the time commitment required). It is also possible that attrition due to mobility appears higher in the treatment group than the control group since contact with the control group is more limited, and mobility may not be evident until several months after it occurs.
Q: How many districts in each state are participating in the study?
A: There are currently 27 in California, 38 in Mississippi and 20 in Florida.
Q: How many of the participating districts required approval from their own IRBs to collect the survey data and conduct the observations, interviews, and focus groups?
A: 14 districts (all in California and Florida) required and have completed IRB processes or research agreements with the initial evaluation team.
Q: Did the initial evaluator secure data sharing agreements needed to collect the extant student level data and teacher retention data?
A: The initial evaluator only began that process; it has yet to be completed.
Q: Did the initial evaluation design exclude information on all of the issues listed in Attachment A?
A: If the question meant to ask about “including” the issues flagged in Attachment A in the evaluation design, the answer is that the design did address all such issues and more.
Q: Were some of the issues included in the design but not adequately addressed?
A: Each was well addressed in the design although NISL is open to changes that would strengthen the design.
Q: Is NISL still interested in conducting a cost-benefit analysis?
A: NISL is open to a cost-benefit analysis being provided by an independent evaluator, but this is not a requirement.
Q: Will contract terms and conditions be negotiated at the time of award?
Q: Please confirm that a fee should not be included in this budget.
A: This is a decision for each bidder to make independently.
Q: Does NISL believe the districts or states will be willing to provide the evaluator with student-level data for all randomized schools irrespective of principals’ continued participation in the study?
A: Yes, we have confidence that the participating parties will continue to cooperate with the evaluation with only a few potential exceptions. In California there is currently a moratorium on the state sharing such data, but we remain hopeful that it will be lifted prior to the completion of this research and that, if not, districts will provide the data.
Q: Has anyone explored the possibility of obtaining student level data files from the districts or states?
A: Requests for data access agreements have been submitted to all the participating states. Collecting such data from each participating district had been considered, but collecting such data from state departments of education is a much more efficient path, so it is the one that has been the focus of such efforts.
Q: Might it be possible to present a set of technical questions regarding evaluation options to the i3 TA provider in advance of the proposal due date?
A: No. Abt Associates does not have a role to play in the execution of this RFP. Once an award has been made they stand ready to address any and all technical questions the winning bidder might choose to pose to them.
Q: Is every treatment principal in the study, or just some, receiving coaching as well as EDP training?
A: Every treatment principal is receiving coaching and the EDP.
Q: How many unique coaches are serving principals in cohort 1 (Fall 2015)? How many in cohort 1b (Fall 2016)? How many are anticipated to serve principals in cohort 2 (Free Seats EDP, Feb 2017)? How many unique coaches are serving principals in total across all the study cohorts? These numbers will help us to estimate the costs of a coach survey.
A: Coaching is only provided to treatment principals in Rounds 1 and 1b. Principals in Rounds 2 and 3, be they control principals or others, will not receive coaching as part of this study. There are 32 coaches engaged with Cohort 1, one of whom is working in two states. There are 10 coaches engaged with Cohort 1b, 5 of whom also coach Cohort 1 principals. In short, the i3 study has engaged and trained 37 educators to serve as coaches.
Q: The table on page 20 in section 5.4 (“Scale of the Initiative”) of the RFP describes the number of participating treatment and control schools in the study at time of randomization and at present. Can you provide 2 separate versions of this table showing the initial and current numbers for cohort 1 (initially randomized in Fall 2015) separate from cohort 1b (randomized in Florida in Fall 2016)? Do the current numbers exclude or include control schools paired to treatment schools that have decided not to comply with treatment assignment?
A: Please see table-1. Numbers include all treatment and control schools at randomization, and all treatment and control schools that have not experienced principal mobility and continue to comply with treatment and data collection. Control schools paired with treatment schools that are no longer complying with treatment have not been removed from the pool.
Q: Based on the table in section 5.4, page 20, it appears as if treatment schools were more likely to stop participating following initial randomization than control schools. Is this correct, and if so can you clarify why this occurred?
A: In addition to lack of participation due to principal mobility or district-level decision making, some treatment principals ceased participating for personal reasons (e.g., health, found the time required to be more than they could commit to).
Q: Have all participating treatment and control schools agreed to take part in study surveys of school staff and students (e.g., signed a letter of commitment?), or only a subset of schools?
A: All but a small number of schools agreed to facilitate the work of the independent evaluator after receiving an explanation of what this might entail (e.g., surveys of the principal, teachers and students). While such schools have yet to submit letters of commitment, they have continued to faithfully participate in data collection.
Q: Can you provide a brief description of the “free seats EDP” cohort that begins in early 2017? In particular, can you describe the expected number of schools that will be randomized to treatment and control groups, and clarify whether the randomization for the “free seats EDP” is already complete or will be completed prior to the start of this project?
A: Participating districts were awarded two free seats for each principal they placed in the initial randomization pool who continues to participate in the study today. Each district has full discretion with respect to allocating these free seats to other principals in the district, to district-level staff, or to school leadership team members, as long as these individuals do not work in or directly supervise control schools. Individuals participating in the EDP using these free seats can begin the EDP as early as February 2017. Each control principal who remains in the study for the duration of the data collection associated with the evaluation also earns a free seat, to be used by that principal no sooner than Summer 2019. In short, there is no “randomization” for free seats save for the initial randomization of principals into treatment and control groups that has already occurred. Just to be clear, all principals participating in the EDP as part of this study receive this leadership development experience at no cost to them or their school district, as these costs are covered entirely by the i3 grant. The same holds true for the “free seats” that will be provided as an incentive for school district participation in this RCT. This being the case, none of the “free seat” participants are treated as study participants. They are simply study beneficiaries.
Q: The deliverables schedule with the final report due June 30, 2020 seems to fall outside of the 5-year i3 grant period, assuming a January 1, 2015 start for NISL’s i3 grant. Has NISL received an extension, or does it have plans to request one?
A: NISL’s i3 cooperative agreement with USDE has been extended to run through June 30, 2020.
Research Questions & Confirmatory Contrasts
Q: A study design with confirmatory contrasts would have been due to NEi3 in fall 2015. Can NISL share that document? The RFP says on page 21 “Given that such a design has already been submitted and approved for this grant, the expectation for the new evaluation team is that it need only be amended as changes are agreed to between NISL and the new team.” Has NISL discussed with NEi3 the possibility of revising the confirmatory contrasts, and to what extent?
A: Please click the link to view the nisl-design-summary. It contains only a single confirmatory question, Research Question #1 in the RFP. Neither NISL nor USDE has any interest in walking away from this question.
Q: NISL includes a comprehensive list of research questions in the RFP, with additional lines of inquiry proposed in the bullets on pp. 17-18. In addition, there are several significant challenges to be overcome in carrying out a rigorous impact study to address the evaluation’s confirmatory research question (e.g., attrition rates, access to CA student achievement data). Given limited evaluation resources and the required confirmatory impact and implementation fidelity analyses, what are NISL’s main priorities among the exploratory questions? Does NISL place a higher priority on carrying out a research design with the highest probability of meeting WWC design standards for impact analysis, or on addressing exploratory research questions related to school outcomes via case study?
A: NISL is interested in seeing the evaluation address all of the research questions as fully as possible. Almost by definition the highest priority goes to the confirmatory question and having the analyses conducted to address it meet the highest WWC standards. That said, our interests in the other questions remain undiminished and we expect they will be addressed not just with case studies but also with surveys of principals, teachers and students that are already well underway.
Q: Across how many districts are study participants (treatment and control) distributed in each state?
A: There are currently 27 in California, 38 in Mississippi and 20 in Florida.
Q: How did prior evaluators secure teachers’ and students’ consent to participate in this research study?
A: Because teacher and student participation to date has been limited to anonymous surveys, all IRB processes involved required only passive consent. The initial evaluators provided schools with a letter for parents in case any parents wished to have their students opt out of the study. To our knowledge this option was not exercised.
Q: Has the evaluation secured approval to conduct research in participating districts, if required?
Q: Does NISL have access to the research applications submitted to participating districts and would NISL make them available to a new evaluator?
A: Each district received written explanations of the project and the evaluation from NISL and separately from the evaluator, copies of which can be made available to the new evaluation team. MOU’s were also executed with each district by NISL, and the initial evaluation team completed IRB processes when required. Documents related to these processes can be made available to the new evaluators. We expect that where the initial evaluator was asked to submit an application to a district IRB a similar request would be made to the new evaluation team.
Q: A significant number of principals have attrited from the study as of fall 2016. What are the reasons for the attrition, particularly among treatment principals where the attrition rate appears higher?
A: Some attrition occurred at the district level from districts that reconsidered their commitment or had leadership changes occur before or immediately after treatment began. Some individual principals also opted out of treatment due to personal reasons (e.g., health, the time commitment required). It is also possible that attrition due to mobility appears higher in the treatment group than the control group since contact with the control group is more limited, and mobility may not be evident until several months after it occurs.
Q: Could you please explain how the Round 2 participants will be selected, the likely number of Round 2 participants, and the treatment they will receive?
A: The Round 2 participants are not meant to be part of the evaluation. They will receive the same EDP treatment as the treatment principals do, but without coaching. This professional development is being provided at no cost to their district as an incentive for districts to participate in the study.
Q: Currently, when a principal moves from a treatment school, does the principal continue to participate in the EDP? Did the current evaluator continue to collect data from that principal? Does the incoming principal in the treatment school automatically participate in EDP if s/he is a novice principal?
A: If a principal leaves a treatment school in the midst of the EDP they can remain in the EDP, but as the original study school is clearly no longer being treated, they are no longer the subject of data collection save for student test scores that are required for intent to treat analyses. The incoming principal is not automatically enrolled in the EDP.
Q: Who administered the spring 2016 teacher and student surveys? What were the response rates and sample sizes for both the teacher and student surveys in spring 2016?
A: The initial evaluator administered the Spring 2016 surveys. Click to view the response-rates-and-number-of-respondents in each case.
Q: Has the spring 2016 survey data been analyzed or should we include a full analysis of those data in the budget?
A: This survey data has received an initial analysis, but we would expect that the new evaluation team would want to explore this data further for a variety of reasons.
Q: Does the NISL budget (as opposed to the evaluation budget) include funds for survey incentives?
A: It does not.
Q: Did the evaluator administer the baseline teacher survey to all study schools in fall 2015, or to only a random sample of 15 pairs of schools in each state?
A: The latter.
Q: Did the spring 2016 administration of the teacher and student surveys include cohort 2 schools in Florida?
A: No. The second group of Florida schools was not randomized until this past summer.
Q: The RFP explains that teachers and students received a link to complete the survey anonymously. Do the survey data have IDs or other identifiers that will allow evaluators to link survey responses between baseline and follow up? Do the survey IDs allow the evaluator to link survey responses to secondary data from the state?
A: Survey IDs identify the school only, not individual teachers or students.
Q: What major constructs were included in the principal survey?
A: They were roughly as follows:
- Approach to school leadership
- School climate/Learning environment
- Curriculum and instruction
- Teaching and learning
- Student assessment
- Parent/Community engagement
- Barriers to student learning
- Principal autonomy
- School impact
- Teacher professional development
- Teacher responsibilities
- Effect of the EDP
Q: Did the fall 2016 principal survey include cohort 2 schools in Florida?
Student Achievement Data & Analysis
Q: Has NISL or the prior evaluator applied for access to state data yet? In particular, has the evaluation applied for access to student achievement data from the California Department of Education? Was the application approved?
A: The initial evaluator applied for data access agreements in all three states. California at present has a moratorium on granting access to student-level data, so the state encouraged the evaluator to pursue such data with individual school districts. We are hopeful that the moratorium will be lifted before this study comes to a close, given the greater efficiencies state-level data collection offers.
Q: On page 14 of the RFP it is stated that “In addition, the design includes a second cohort of sixth-grade students for AY 2016-17, using fourth-grade assessment scores from Spring 2015 as a covariate, and also following these students for three years.” Do all schools have a second cohort of sixth-grade students included? Why were fourth-grade assessment scores used as covariates for the second cohort of sixth-graders, instead of fifth-grade scores?
A: Yes, to the first question. We are optimistic that bidders can figure out the answer to the second question on their own.
Q: The RFP (p. 20) notes that “NISL will want to be included in such [data use] agreements to ensure it also has access to de-identified versions of these various data bases for its own internal data analysis.” Can NISL say more about which data sets it wants to access? Does NISL want access to principal, coach, teacher, and student survey data? Interview or other qualitative data?
A: NISL wants access to the deidentified student achievement data as well as the survey data. As for the qualitative data, it would depend on what form it takes, and NISL would want to be respectful of any promises of confidentiality the evaluator might make.
Q: What types of analyses does NISL anticipate conducting with those data?
A: NISL will be interested in mining the data to gain a better understanding of the implementation of its treatment than might otherwise be the case, and to learn what it can that might lead to improvements in the EDP and its associated coaching.
Q: Has NISL or the prior evaluator collected teacher rosters in study schools at baseline for the purpose of tracking teacher retention?
A: The evaluator collected teacher rosters in Fall 2015 and Fall 2016.
Q: What involvement, if any, do districts have in supporting the EDP?
A: There are a variety of things we expect and hope districts will do in this regard. They include encouraging each principal’s full involvement in the EDP, giving them a wide berth to try out new ideas with their school that may emerge from their participation in the EDP, limiting mobility among principals in treatment and control schools as much as practicable, and doing what they can to keep their control schools uncorrupted.
Q: Assuming implementation fidelity indicators and thresholds have already been established, is NISL willing/able to modify them if necessary under a revised design? What implementation fidelity indicators, if any, are virtually locked in, e.g., a high investment in a data collection tool has already been made and would be logistically infeasible to change?
A: NISL is open to change on this front as no “high investment” has been made in any one of these efforts at this time. At the same time, most treatment principals have completed their exposure to the EDP (although coaching is continuing and Round 1b principals are just beginning their EDP participation) and this may limit potential changes. But not knowing what imaginative proposals may come forth there is no reason to declare that nothing can change.
Q: What data sources are being used to measure implementation fidelity for NEi3?
A: EDP attendance records, shipping receipts to confirm receipt of treatment materials, and coaching logs.
Q: Is there a central tool that captures all novice principals’ ALPs?
A: There has not been one until recently. At present we have collected a sample of the ALPs from the first cohort and expect we could acquire more if the evaluator had a strong interest in seeing more ALPs than the ones currently on hand.
Q: In Section 2.5 on page 7 of the RFP, you state that bidders must provide one (1) electronic copy in Microsoft Word or Portable Document Format. However, Section 3 on page 8 of the RFP states that the cost proposal must be packaged as a separate file. Please clarify whether the requirement is for a single file containing all proposal sections, including the cost proposal, or whether bidders should submit two separate files, one containing the title page/executive summary/technical proposal and a second containing only the cost proposal.
A: Two separate files, both attached to a single email, is what we are after.
Q: Section 3 on page 8 of the RFP states that bidders must respond to the RFP by addressing the relevant sections, signing an enclosed confidentiality agreement, and supplying mandatory supporting documents. We were not able to locate the confidentiality agreement within the RFP. Would you please provide the agreement for review and execution?
A: Thank you for catching this. It was a mistake on our part to reference such an agreement in the RFP. However, the research agreement that NISL will execute with the new evaluator will contain a confidentiality provision designed to protect both parties’ interests.
Q: How many cohorts (of 25 principals each) do they plan to recruit in 2016-17, 2017-18, and 2018-19 by state (e.g., California, Mississippi, and Florida)?
A: During the 2016-17 school year, three cohorts in Florida will be the only additions to the treatment group. These cohorts will comprise treatment principals, facilitator candidates receiving training, and a variety of school and district personnel utilizing “free seats.” All cohorts recruited for participation in 2017-18 and 2018-19 will consist only of participants using the “free seats” that were provided to districts as incentives, and none of these participants will be part of the treatment or control group.
Q: In the RFP it states that there are 112 treatment principals (across the 3 states) and 132 control principals. Is there an expectation that if a principal leaves his/her school, the new principal will be required to complete EDP training as well? Or will this school be removed from the study/sample?
A: There is no expectation that such a new principal would complete the EDP, and there are different points of view about the merits of such an occurrence. A school in which a partially treated principal has left currently remains in “intent to treat” analyses regardless of the status of the principal that follows.
Q: Are there other members of a school (or district) leadership team that also complete the EDP with the middle school principals?
A: Yes, several of our partner districts in the i3 states are sending district staff through the EDP through the use of the free seats provided to districts as an incentive for participation. Some of these individuals are being trained to facilitate the EDP for future cohorts. Members of leadership teams in control schools, and individuals who directly supervise control schools as part of their job, are not permitted to enroll in the EDP until the completion of the data collection for this study.
Q: If yes: (a) Is this mandatory or optional? (b) How many additional individuals can participate? (c) What are the typical roles (e.g., assistant principals, department chairs, superintendent, assistant superintendent, etc.)?
A: There is no limit on participation by non-principals. Typical roles for such individuals include district staff working in roles related to professional development, curriculum or principal support, and assistant principals.
Q: What is the expectation around coaching?
A: Every one of the treatment principals will be coached. Coaching begins after Unit 3 of the EDP as principals begin to design their ALP, and will continue for 2.5 school years. The first cohort of treatment principals who began the EDP in Fall 2015 began receiving coaching in January 2016, and that coaching will continue through June 2018. Principals who began the EDP treatment in Fall 2016 will begin to receive coaching in January 2017 and that coaching will continue through June 2019. Coaching includes monthly touch points at minimum, generally alternating between in person school visits and remote conversations via phone or video chat.
Q: What is the principal to coach ratio? Does this differ by state?
A: It ranges from 1:1 to 4:1 across all three states.
Q: Are principals required to have (or meet with) their coach or is this optional?
A: Coaching is mandatory, but it is fair to note that it is met with varying levels of enthusiasm.
Q: Does coaching support differ (or is it tailored) based on the need of the principal (or the school)?
A: It is tailored to meet the needs of the principal, the school context, and the focus of the principal’s ALP. However, all coaching by NISL coaches is tailored to enhance the curricular objectives of the EDP, making the NISL coaching model different from a typical professional coach or career coach model.
Q: Do principals receive up to 3 years of consecutive coaching support or only for the first year that they are enrolled in the EDP program?
A: See above response regarding expectations of coaching.
Q: For the case study selection, was a score sheet used to select/identify the six potential sites (two for each state)? Can that information be shared (once the contract is awarded)?
A: Final decisions on cases have yet to be made. NISL has created a pool of potential case subjects for the independent evaluator to choose from. Each nominee must meet an agreed-upon set of pre-conditions, and for those that do, a set of facts is assembled that includes the principal’s ALP design and the views of their EDP facilitators regarding their potential, the distinctive nature of their school, and their ALP.
Q: Are there data sharing agreements in place at the state or district level to obtain achievement or other data from participating schools where treatment or control principals are placed?
- If not, will the awarded evaluator be responsible for requesting and executing these data sharing agreements at the district or the state level?
- If district-level data sharing agreements will be necessary, how many districts are included in the study?
A: The initial evaluator applied for data access agreements in all three states, but none have yet been secured, so there is more work to be done here. California at present has a moratorium on granting access to student-level data, so the state encouraged the evaluator to pursue such data with individual school districts. We are hopeful that the moratorium will be lifted before this study comes to a close, given the greater efficiencies state-level data collection offers. There are 85 districts in the study, of which 27 are in California.
Q: Would you please clarify as to what is required in the Cost Proposal? The RFP specifies “shall contain all pricing information requested herein, using the same headings described below”, but no additional information is provided.
A: That is a good question that deserved better in the RFP. Our expectation is that the cost proposal will be organized by academic or calendar year and include line items for each senior and other key staff (rate and level of effort) and for staff classes that fall outside of this definition. In addition, there should be line items for fringe benefits, direct non-labor costs such as travel, printing and survey administration, and for indirect costs. Where subcontractors might be engaged, a similar cost proposal should be prepared. In both instances, each budget should be accompanied by a budget justification that meets the same standards as are required for USDE i3 grant applications (e.g., basis for indirect rate).
Q: The Table on page 19 lists “the number of treatment and control principals that are participating in this initiative” at the point of randomization and then as of 11/08/16. Were you envisioning that the 11/08/16 counts correspond to the principals that will participate in the follow-up data collection, or might the study attempt to collect some data (if only school records) from the larger number of principals that were randomized? In other words, might it be possible to keep some of the “attrited” principals in parts of the study even if they are no longer participating in the intervention?
A: Unless convinced otherwise, we have little or no interest in following principals who have left their “treatment” schools as it seems not the most efficient way to spend scarce dollars. However, following all the treatment schools themselves for the duration of the study seems appropriate for any intent to treat analyses. At the same time, we have a greater interest in understanding what will happen in schools that hold on to their treatment principals (i.e., the treatment on the treated schools).
Q: What were the reasons that principals dropped out of the study (or the program)? Did entire districts drop out, or only individual principals? Do the numbers in the table on p19 include the matched pairs of the principals that dropped out (i.e., have the matched pairs of attriting principals also been removed from the study and accounted for in the table?)
A: Table 1 in Q&As Set III speaks directly to the first part of this question. The table has not been adjusted by pairs. It only reflects the change in status vis-à-vis the participation of individual schools.
Q: How many districts within each of the three states are still participating in the study at least in some of the data collection?
A: See Table 1 in Q&As Set III.
Q: What are the numbers of teachers, students, and coaches that remain in the study sample, separately for the treatment and control group schools?
A: There are 37 coaches in the study. Our rough estimate is that there are 47,000 students and 1,800 teachers in grades 6-8 of the active treatment schools and 58,000 students and 2,200 teachers in grades 6-8 in the control schools.
Q: Are there any signed MOUs or research applications with the states or districts from which the school records administrative data will be collected? Will the evaluator be able to collect all of the needed student administrative data from the states, or will it be necessary to request student data from each individual district?
A: Data use agreements have been prepared and submitted by the initial evaluator to each state but will have to be resubmitted by the new evaluator. NISL has an MOU with each participating district that requires it to cooperate with the evaluation team and that prefigures requests for administrative data, participation in surveys, and the collection of student test scores. The MOUs also indicate a preference for collecting as much of this data as practicable from the states. Nevertheless, some data has been collected directly from the participating schools (e.g., student and teacher rosters).
Q: Would you share the study design report and i-3 contrast tool from the previous evaluator?
A: The design report has previously been made available. The three contrast tools (one for each state) are available by request. Please contact David R. Mandel, Director, Research and Evaluation, at firstname.lastname@example.org.
Q: What data, if any, have already been collected to assess fidelity of implementation?
A: EDP attendance records, shipping receipts to confirm receipt of materials, and coaching logs.
Q: On page 17 of the RFP, it indicates that teachers and students complete their respective surveys anonymously. Does this mean that the evaluator will not obtain rosters of teachers and students from participating schools for use in contacting survey participants, and that the evaluator will provide one link for the teacher survey and one link for students to use to complete their survey?
A: The initial evaluator obtained both teacher and student rosters from the schools for a variety of reasons, including tracking teacher retention and identifying subsamples for surveys in larger schools. Survey responses came with a school ID but not individual teacher or student IDs.