Thanks for your questions. Overall I think people took the change of setting very much in their stride. Many families had experience of having to do something similar during the war.
By far the biggest adjustment was moving from a more traditional 'chalk and talk' style of teaching to our more engaging, interactive style. The surprise, again, was how easily students embraced it. You can hear our students talk about the experience in their own words here: http://www.vimeo.com/risingacademies/studentfeedback.
On the Ebola Awareness results, I should clarify that scores did not rise from 0% to 21%. Students had some knowledge before they went through our Ebola Awareness curriculum, but they knew a lot more by the end of it. Now, all this is based on a 17-item Ebola knowledge test of our own devising, so I wouldn't want to exaggerate the precision of our measures. But an average gain of 21 percentage points is equivalent to an effect size of 0.9 standard deviations, which is large for an educational intervention.
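For readers unfamiliar with effect sizes, the arithmetic behind that claim can be sketched as follows. This is a minimal illustration, not our actual analysis: it assumes the simplest definition of effect size (mean gain divided by the standard deviation of scores), and the implied standard deviation below is inferred from the two reported figures, not something we reported directly.

```python
# Hedged sketch: relating the reported 21-point average gain to the
# reported 0.9 SD effect size. Only the 0.21 and 0.9 figures come from
# the text; the standard deviation is back-calculated, not measured.

mean_gain = 0.21     # average score increase, as a proportion of max score
effect_size = 0.9    # reported effect size in standard deviations

# Simplest effect-size definition: d = mean_gain / sd,
# so the implied SD of test scores is:
implied_sd = mean_gain / effect_size
print(f"Implied score SD: {implied_sd:.3f}")

# On a 17-item test, that SD corresponds to roughly this many items:
print(f"Roughly {implied_sd * 17:.1f} of 17 items")
```

In other words, the two reported numbers are mutually consistent if scores on the 17-item test had a spread of roughly four items, which is a plausible figure; the real calculation would of course use the measured standard deviation.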
Our students did not listen to the radio lessons. They were a nice idea in theory, and much blogged about, but in my experience people's enthusiasm for them waned once they actually heard the lessons for themselves, because the execution was disappointing. If you are interested in knowing more I can share a link to some recordings of the broadcasts that I made. One frustration I have is that while there have been regular surveys to understand how many kids are listening to the broadcasts, I am not aware of any attempt to measure what progress kids have actually made in their learning as a result of listening to them. This is not good enough, in my view: even in emergencies, efficacy matters.