Event ID: 1638067
Event Started: 10/22/2010 8:15:13 AM ET
The next thing you know, we have 40 programs. We are really proud of the work we have been doing, and of the fact that all of you were able to come join us to talk about the work you are doing.
We have a very full two and a half days ahead of us, but I am hoping we are going to accomplish our goals and meet all of our expectations, and yours as well. If at any time during the conference you have questions, there are many of us who will do our best to answer them at the front desk; come to Liz or me -- we are scattered around here.
This is a group effort, a multi-agency approach to recognizing and wanting to collaborate on teacher research experiences. Two groups helped put this conference together: an advisory council and a planning committee, thinking about the best ways to make this happen so it's a real opportunity for all of us to talk to one another and not waste any of the time we have here.
I would like to recognize council members and committee members, and ask those in attendance to stand. Those who are not able to make it send their sincere apologies; you may know there are three or four other big events happening this week. We picked a busy weekend. Many planning committee members and advisory council folks are at multiple places and hope to arrive here and there throughout the next two and a half days.
The first, Louisa Cope, will be here later in the afternoon. In her stead is Chris -- thank you for coming. [Applause]
Bill Valdez from the Department of Energy wasn't able to make it; in his stead is Jeff Stilts. [Applause]
Pat Johnson from the Department of Education is not sure if she's able to make it yet this morning. Linda Slakey from NSF has two people representing her here, Mary Poats and Kathleen Bergin; and also council members -- [indiscernible] and Jay Dubner from Columbia.
I would like to acknowledge Gail and Jay at this moment for being real pioneers for this kind of event. I was fortunate to attend Gail's program at the University of Rhode Island; they both have a passion for teacher experience programs. If you haven't met them or been involved with the programs they run, I highly encourage you to talk to them over the next two and a half days. They know a lot, and without their constant guidance and advice, I am not sure I would have made it through the planning of this.
On our planning committee, the group that met regularly and had many discussions about this week, we have Michael Kennedy from the Department of Energy -- [Applause] -- and I will recognize Kathleen and Mary again from NSF. They really provided a lot of guidance for us in planning all of this.
[Applause ]
Bob Hanssen was our outreach manager from NOAA; he's on the Mall preparing for the Science and Engineering Festival happening there. He was a great help.
If I could get Elizabeth McMahon -- Liz and Elizabeth are with the Teacher at Sea program. I wanted to recognize them this morning; I will probably do it many other times throughout the day.
We will start this morning with our first speaker, Dr. Larry Robinson. Dr. Robinson was appointed Assistant Secretary -- served as vice president for research at Florida A&M and since -- served as --
Dr. Robinson graduated summa cum laude from Memphis State University and earned a doctorate in nuclear -- became the first African-American to serve as science adviser to -- education and extension services, serving until 2009. In 2008 he was elected to serve on the Ocean Research and Resources Advisory Panel, and he is a founding member of the National Science Foundation's National Ecological Observatory Network (NEON) science and technology education advisory committee.
Previously he served as a scientist and group leader at Oak Ridge National Laboratory from 1984 to 1997. His research interests include environmental chemistry, the application of -- to detect -- and environmental policy and management. Welcome, Dr. Larry Robinson. [Applause]
Dr. Robinson: Thank you for that introduction, and thank you for being here today. What a beautiful place to have a meeting. The view coming in is just overwhelming, impressive; it makes it difficult to come in and engage in the meeting. In spite of that, I commend you for taking the time to join us in this very important engagement, because like many of you, I have had the privilege of working not only as a scientist but also, over my career, trying to extend my services to the K-12, undergraduate, and graduate sectors. I want to remind everyone of the most important part of my introduction that was left out: I am a faculty member on leave from Florida A&M University. Let's not forget about that.
[Laughter]
It's my pleasure to be here at this morning's conference, the Teacher Research Experience -- this could not have been possible without the support and collaboration of many hours of work with our colleagues at the Department of Energy, the National Science Foundation, Columbia University, and the Centers for Ocean Sciences Education Excellence. It's interesting that I have had some affiliation with all of these entities over the course of my professional career, having spent over 12 years at Oak Ridge National Laboratory.
I want to give thanks to the corporate sponsors and the [indiscernible] for webcasting these sessions as well.
NOAA is a trusted steward of the ocean and Great Lakes natural resources. NOAA has to protect life and property in times of natural disaster, and it is a premier agency for applied sciences.
Although I have worked in partnership with NOAA for many years, my tenure in this capacity is relatively short. I was appointed a few weeks after the Deepwater Horizon oil spill; I was on the campus of Florida A&M University on May 10, fully aware of the event unfolding. I went into that knowingly -- or at least I thought I knew what I was going into. On May 11 I was in the Gulf of Mexico. Having dinner just last evening with a former colleague at the Department of Energy who is retiring, one of the questions I was asked was about how much time I have spent in the gulf. I thought about it a while, and it came to me that until about the second or third week in September I hadn't spent a full week in Washington, D.C., since my swearing in on May 10. I spent quite a bit of time in the gulf.
I was just there on Tuesday of this week, interacting with some of our colleagues at the Sea Grant annual meeting.
Today I want to talk a little about a couple of experiences as a mentor to teachers and students, and explain one of the more important developments that has occurred with regard to ocean science and policy in a very long time in our nation. That is the president's national ocean policy, which challenges us to increase and coordinate educational opportunities, as I will point out to you later.
I have been fortunate to be a mentor to teachers. I want to point out one example, a teacher I mentored in 1992 as part of a teacher research experience at Oak Ridge National Laboratory. That teacher happened to be from Red Bank High School in Chattanooga, Tennessee. Not only did we have a very long and productive summer working together, we also managed to get the results of that experience published in the Journal of Chemical Education. The title of the paper, as you can see here, was about research and learning opportunities in a research-based nuclear -- laboratory. At least one person in the audience has a full appreciation of this, because I spent a lot of time with that colleague at a sister laboratory operated by the National Institute of Standards and Technology.
I would have had a lot more teachers work with me at Oak Ridge, except for that sign down at the bottom here. I had a lot of students and faculty who signed up to work with me, but they got to the bottom, saw "Danger: Radioactive Materials," and I lost a few people along the way.
I could talk about a larger number of folks, but that particular sign got in the way. In this particular case it was a very productive experience for both me and Mr. Brown, and I think the important part of this is that these experiences, just like those with undergraduates in the most successful programs, should lead to something tangible in terms of scholarship. A publication is a good outcome that shows the experience was well worth the effort.
I know also, through this experience working with teachers and students, that education plays a key role in stewardship. That's why I would like to focus your attention on the new national ocean policy.
On July 19, President Obama signed the policy and acknowledged the need to look at the oceans, coasts, and Great Lakes ecosystems more holistically. We regulate human activities at the federal level, but with 140 statutes, regulations, and policies, the sector-by-sector, issue-by-issue approach misses the big picture. Therefore the national ocean policy has agencies coordinate instead of isolate, and cooperate instead of compete.
Our oceans, coasts, and Great Lakes play a critical role in the life of every American. Coastal counties are home to over half of America's total population and generate approximately 57% of our gross domestic product. Coastal regions also provide enormous environmental benefits. Shallow coastal wetlands provide a buffer against coastal storms. Coral reefs serve as nurseries for many species. Estuaries and bays -- and they also hold recreational value, as witnessed by the many millions of people who come for vacation. Coasts, Great Lakes, and oceans do indeed matter. The national ocean policy is based on the recommendations of the national ocean policy task force. These recommendations were formed with public input at six regional meetings, a series of 38 expert roundtables, and thousands of written comments submitted by mail and electronically.
Based on this input, the policy identifies the shared values among stakeholders depicted on this slide.
Balance ocean health and community prosperity; level the playing field for all stakeholders involved in the regulation, management, and stewardship of the oceans; respect the unique character of each U.S. region; make decisions based on the best available science. That last one is a key element of the policy. The best available science forms the foundation for policy decisions that will be made in the regulation and management of our coastal, ocean, and Great Lakes resources.
The national ocean policy has nine priority objectives, divided into two categories. The first is how we do business, and the other represents the areas of special emphasis. These represent substantive areas of particular importance to achieving the national policy. Over the course of six to 12 months, the National Ocean Council will develop strategic action plans for each of these broad objectives.
I want to give you a sense of things already underway. The National Ocean Council consists of 27 federal agencies; all indicated here have a seat on the council. NOAA and the Department of Commerce have a seat among the 27 members. On September 24 the first Deputies meeting was held; I had the privilege of representing NOAA. The full National Ocean Council is scheduled to have its first meeting on November 9, here in D.C.
They will be looking at a set of issues the Deputies talked about, some organizational issues, charters, committee structure, et cetera. But I want to point out to you the national ocean council is off and running. These strategic action plans are going to be available for all of us to look at within the next six to 12 months.
I want to focus on one of the how-we-do-business priorities in the national ocean policy: inform decisions and improve understanding. This priority is one NOAA does a great deal of work in, and we do this work in collaboration with all of you. As teachers, scientists, engineers, and managers, you work in partnership in education programs around the country and play a key role, and will continue to play a key role, in this priority.
Before I go to the next priority objective, one more point on this one. One of the shared values on the previous slide I failed to emphasize: respect the unique character of each region. That means for the national ocean policy to be successful in terms of implementation, although it needs national-level coordination, the actual implementation has to come from a bottom-up approach. We have to really respect the unique characteristics of each region around the country. So implementation will be based primarily on a regional approach, with plans developed regionally, as opposed to dictated by those of us in Washington.
As explained in the national ocean policy, the goal of the inform-decisions-and-improve-understanding objective is basically to use the best available science and knowledge to inform decisions affecting our oceans, coasts, and Great Lakes, and to enhance humanity's capacity to understand, respond, and adapt to a changing global environment.
All of you in this room understand this much better than most. You have given of your own time and efforts to participate in teacher research programs over the years. These programs have provided opportunities to be part of research that takes place in the remotest areas of the ocean, at the poles, and high in the atmosphere. This knowledge is brought back to your classes, schools, colleagues, and communities, to impart a new understanding of real-life scientific research that is relevant to our very existence on Earth.
NOAA is committed to providing the best science possible to help inform decisions and improve understanding. We provide cutting-edge, validated scientific research, and we also provide data and research in ways that can be taught and communicated to the non-scientific community.
I will highlight a few of our educational programs at NOAA.
The mission of NOAA's national marine sanctuaries is to conserve, protect, and enhance their biodiversity and cultural legacy. Our national marine sanctuaries are protected waters where giant humpback whales breed and flourish and shipwrecks tell stories of our maritime history. They provide safe habitat for species close to extinction and protect historically significant shipwrecks; ranging over 100,000 square miles, they are cherished recreation spots and support valuable commercial industries.
In July I had the opportunity to participate in the Monterey Bay sanctuary's groundbreaking in California. A green, sustainable building in this setting will provide enhanced opportunities to instill in visitors a sense of personal stewardship with regard to the sanctuary and an understanding of how to keep it protected.
The sanctuaries have been leading a mission called "If Reefs Could Talk," which has scientists and educators working together, talking about biodiversity through live web broadcasts in English and Spanish. Several schools and aquariums have had the opportunity to interact directly with the aquanauts while they were living in an underwater habitat.
NOAA is also engaged through the national estuarine research reserves -- a network of 27 areas representing different regions of the U.S., protected for long-term research, water quality monitoring, education, and coastal stewardship. Each reserve is managed on a daily basis by the state, with input from local partners. I am sure many of you have been to an estuarine reserve. I have had the privilege of working with them since 2001. I was just trying to determine how many students I have had, since we started the environmental sciences program, who have done research around the country. On the short ride here I was able to count five doctoral students of my own whom I advised on dissertation projects based on research in one of our national estuarine research reserves.
Through a summer program, K-12 teachers and 25 to 35 students spend a significant amount of time on educational research associated with that reserve system.
Some of you might know about EstuaryLive, one of the programs that investigates estuaries around the country; it produces live broadcast programs covering a wide range of topics related to estuaries and coastal ecosystems. NOAA has a wide variety of student opportunities. Just a few weeks ago we opened a call for applications for two of our premier undergraduate student scholarship programs, the Hollings Scholarship and the Educational Partnership Program scholarship. Both provide students with academic financial assistance and a 10-week full-time internship opportunity during the summer at a NOAA facility, with hands-on educational training in NOAA-related science, research, and policy management activities.
For more information on these opportunities, please visit the website listed here. I have personally seen students take advantage of these programs, and in particular, in getting students to transition from the undergraduate to the graduate level, we have been very successful in the NOAA center, through the combination of the Educational Partnership Program's undergraduate and graduate sciences programs, generating perhaps the largest single number of underrepresented minorities in the NOAA science and technology workforce since 2001. We have had, I believe, about 15 PhD students through this program who have been or are currently employed at one of NOAA's line offices, or here in the D.C./Silver Spring area. This has been a successful program for me and my colleagues associated with the center. I strongly advise you to have your students take advantage of these programs. They are gateways to the workforce at NOAA.
I was just at the Sea Grant meeting in New Orleans earlier this week, and I was astonished to find, through a survey NOAA recently conducted of employees, that 22% of the persons who responded were former Sea Grant scholars. These types of programs are very instrumental in terms of workforce development, but also as mechanisms to get students employed at NOAA, and perhaps the same is true at DOE and the other federal agencies in attendance.
A big part of why we are here today is NOAA's premier teacher program and research experience opportunity: the Teacher at Sea program. The program has served more than 600 kindergarten through college-level teachers from every state, as well as Puerto Rico, American Samoa, Argentina, and Chile, allowing teachers to enrich the classroom with a depth of understanding made possible only by teachers working side by side with scientists. Our scientists contribute to the world's body of scientific knowledge and share it with these teachers.
For those who have been a Teacher at Sea, I thank you for your dedication and contribution to this program. We at NOAA are very proud of the program and are increasing its breadth by adding new programs: Teacher in the Lab, Teacher in the Field, and Teacher in the Air. I would also like to thank the teachers and scientists who participated and applaud your willingness to reach beyond the classroom.
During those 12 years at Oak Ridge, I hosted students, or students and teachers, for summers and sometimes academic years. Oftentimes we scientists get no additional compensation for this; it is an additional duty on top of everything else you have to do as a research scientist. What the scientists who work at places like the national laboratories and NOAA do is really commendable.
Finally, what the ocean promises is boundless, but we have learned its resources are not. We need to embrace a new ocean ethic, one that recognizes the link between ocean health and our own prosperity, well-being, and security. One of the critical lessons from the Deepwater Horizon is the link between ocean health and our prosperity. In order to move forward we must forge new partnerships, collaborate, and share information, just as you will be doing over the next two and a half days of this conference.
By supporting ocean research or other Earth sciences, you are contributing to our nation's role in enhancing STEM education. Therefore, I would like to thank you again for your willingness to spend this, what appears to be a beautiful weekend, working to make these programs even more successful. By establishing connections and partnerships among your programs, we hope you will collaborate to meet individual project challenges and come away with an ability to make real changes and contributions to your programs, to the ultimate goal of enhancing STEM education, and to the stewardship of our resources that will follow.
As a scientist and educator both, I appreciate being invited to speak to you this morning. I look forward to seeing the results of this conference. Thank you very much.
[Applause ]
I will take a few easy questions from you this morning.
The first one is always the toughest.
Question: [indiscernible]
I think there's a resource part of that. In order for the policy to be successful, we can make all the good policy decisions we want in terms of regulating things, but until we figure out the ultimate challenge of changing behavior through stewardship, we will not be able to reap all the benefits of the national ocean policy or its policy objectives. To do that takes resources, and sometimes, particularly now, new resources are very difficult to come by. As you know, we are in the midst of a continuing resolution. I do think, however, that this presents a challenge for us to better utilize the resources we have, if this is indeed a priority for us. It has to be manifested in how we allocate our resources to help the nation implement the policy and its goals.
The educational part is something I think we often take for granted. We often assume, even at agencies, that the talent needed to provide the best available science to undergird decisions is somehow going to appear on its own, without a focused effort. I happen to be from one of those disciplines, nuclear science, where you have to pay particular attention to this. If you don't, it's not going to happen on its own. The same thing applies equally to the ocean-related sciences. We have had the privilege of not seeing the decline there that the nuclear education arena saw back in the 90s, but we need to pay attention and make the appropriate investments to ensure we prepare the workforce in the right areas to sustain our efforts in ocean, coastal, and Great Lakes stewardship.
Question: I am one of those ground-level people. I teach elementary school and have been fortunate enough to be the beneficiary of three of these programs. In none of those so far have I had to quantify my results for data management, to show the impact of my teaching post-program. How are you managing to stave off that requirement? It is important to us that we don't have to operate in numbers; a lot of our results are intangible and can't be quantified. My fingers are crossed that this will continue, but the -- from Washington is data.
At every level, having outcomes from the investment provided to us by Congress and the Office of Management and Budget is very important in order for us to make a case for holding on to or enhancing these resources. I don't want to -- it's easy to count things, right? It's easy to say we produced this many or that many. It's a lot more difficult to measure impact. You can say what you produced, 10 students, but they may be not very good students compared to 10 good students who went on to do great things in some way less tangible. We haven't gotten out of the woods on that. We will have to continue to work out ways to put even those intangible outcomes in some manner that people can understand, so they see what these impacts are. For example, it's one thing to produce a good student in science and engineering technology who goes away and helps some other nation improve its -- another thing to count those students who manage to stay here and contribute to this nation's bottom line. There are a number of measures like that, intangible but fairly important, and we don't often use them.
At USDA we had a lot of fun trying to measure the impacts of education and extension activities, right? Looking to see what the impacts of best practices are: a fact sheet so some farmer knows, here's the best way to apply fertilizer, for example. What you don't know is, okay, is that really making a difference in the amount of nutrients finding their way to some aquatic ecosystem? That's the impact part. The impact oftentimes comes a little bit longer in time than does the immediate tangible thing. At the time you produce the student, that particular number is going to impress Congress, but the student who goes on to be a Nobel laureate, that will be down the road. We have to get people to think about the intangibles, not just the beans, but how the beans flourish and contribute to the nation's bottom line. But I will tell you, it's not easy. We still have to work very hard as educators to give people a better appreciation for these intangibles. I haven't quite figured it out, but I welcome your thoughts about it. It is a significant challenge.
Question: Good morning. I have a question about the new components you referred to, the lab, field, air, for the Teacher at Sea program. Is that all incorporated into one summer at sea or three separate programs?
I will have the expert speak to that. I am not sure.
Actually, it's several different pilots, not all one -- four separate programs. You can learn more about that during the poster session this afternoon.
Thank you.
Somebody needs a referee out there because -- it's getting interesting. I don't want to be the one.
Question: In the wake of the recent Deepwater disaster, do you think NOAA will play a role in bridging the gap in awareness, emergency management, and the livelihood of the coastal communities?
Yes. [Laughter]
Please expand.
Let me expand. I will take a few minutes to expand. There are a couple of major things occurring right now as we speak. First, NOAA is a natural resource trustee, and in the damage assessment process, NOAA plays a key role in helping identify what impact the oil spill has had on those trust resources -- coastal ecosystems, fisheries, et cetera. We will be engaged in this process for some time. The assessment is a legal process, so it's going to involve things that, when you do the assessment, you don't necessarily want to talk about -- you have a responsible party, or parties, here building their own case. But NOAA is intimately involved in that.
A few weeks ago the Secretary of the Navy released a report on the long-term ecological implications of the oil spill for the Gulf of Mexico and beyond. As a result of that, the president established a Gulf Coast ecosystem restoration task force, to be chaired by EPA administrator Lisa Jackson, to look at some of the long-term implications, ecological and otherwise, of the oil spill, beyond what is covered in the natural resource damage assessment process. NOAA is involved, and yours truly will represent NOAA on that task force. The task force itself had its first organizational meeting a couple of weeks ago and will have its first full public meeting on November 8 in Pensacola, Florida.
We are identifying the issues we need to think about: ecological, economic, and policy. There are a number of questions about how we are going to regulate things differently in the gulf, and with regard to oil and gas exploration around the country. What are some of the things necessary so that we are better prepared for the next event that occurs?
About -- I guess two weeks ago there was a meeting of principal investigators sponsored by JSOST, the Joint Subcommittee on Ocean Science and Technology, in St. Pete, with a session of principal investigators on what type of research has been done and what needs to be done, not just in the ecological sciences, but from a socioeconomic perspective: beyond just economic issues, what are the impacts on people, communities, culture, et cetera.
At that meeting, where I spoke, I said one of the responsibilities we have now, not just in the federal government but in the scientific community as well, is that we need to be a lot further along in terms of understanding how to mitigate the effects of oil spills, and in understanding the impacts on people from these types of catastrophes, than we were at the time of the Exxon Valdez and Deepwater Horizon, so that all of us collectively will be much better positioned to manage the next event, respond, and understand the implications. We will have to do that together. It's not just the responsibility of the federal government, but of all of us.
We have time for one more question.
Question: Piggybacking on that whole topic, has anybody given thought to research experiences for teachers and students in cleaning up the spill, prevention, things along those lines?
Okay, I remember, very early on, my second day on the job, I was in the gulf and we had a series of Sea Grant -- a series of town hall meetings, where people from various sectors of the community, teachers, ministers, fishermen, came to express their concerns and have questions answered about various things.
At one of those meetings, I believe in Alabama, where I spoke and participated on a panel, I got a very similar question from a teacher who was a recipient of one of our B -- . Many scientists who work in those areas could not go do the routine things they normally did, because it required a higher level of training, OSHA training. This was a contamination event, and the places impacted were no longer treated the same. No person was allowed to go into those areas without appropriate training.
When this teacher asked about students going to the area, I was very concerned -- not only because of the training requirements, but because of the real safety concerns about them being exposed, with other activities going on: burning, et cetera.
However, we also convened a meeting at NOAA with all of our educational entities, maybe in August or September, and said: this is an educational moment. It is truly a disaster, but there's a tremendous educational opportunity here, whether we have people actually in the field where there's oil, or people looking at baseline information to determine what the oil has done. It would be just as sad not to use this to somehow expose students to the opportunity to learn from it, as we will learn as a society; I think that would also be a mistake.
In terms of going directly into certain areas impacted by the oil, the response activities going on, probably not a good thing for a lot of people, students and/or teachers and even many scientists. We had to get special training to do so ourselves.
Now that things are beginning to -- well -- wind down in terms of the immediate impacts of the oil, I have gone on a couple of cruises, research cruises where educators are being involved, because they were not in areas immediately impacted by the oil. We were collecting baseline samples in anticipation of those areas being impacted later so we would know how to compare before and after impact.
Thank you, Dr. Robinson, for coming this morning, answering our questions. We appreciate your time.
[Applause ]
Good morning, everyone. I thought I would quickly introduce myself. I am Elizabeth McMahon, I corresponded with many people in the room over the years, it's nice to be able to begin to put some names with faces of those I haven't met yet.
So welcome. I want to quickly go over logistics, so you can figure out where to go, when, where the bathrooms are, things like that. Let me give a little background about myself.
I have been at NOAA about five and a half years now; in my previous life I taught at Virginia Tech. I used to teach, so I can definitely commiserate -- I was an instructor, not a faculty member. I consider myself one of the ground forces. After about five years of teaching I got burned out. So I went and looked for another job and was lucky enough to end up at NOAA. It's been an absolutely incredible experience ever since. Sometimes I miss being in the classroom. My favorite part was being with the students. The rest I could take or leave.
Being in the classroom is the best part and I commend the teachers out there that do that every day. To this day, I don't know how you do it for so long. My mother is a teacher, she just retired, taught about 30 years. Every day I think about the fact that she did that every day of her life. It's just a tremendous, tremendous thing you guys do.
Anyway, with that being said, a couple of highlights: you see the agenda, we have the speakers, and we are going to have a break.
I want to highlight the roundtable and the poster sessions. For the roundtables, I want to remind the facilitators that you should have been given a page with a few directions; if you didn't get that, it's out at the registration table, so you know.
Those who aren't facilitators, you are welcome to go to any table. We will all be discussing the same things. You can stay where you are, or if there's a facilitator you happen to recognize on the list you want to sit with, that's fine.
We're going to spend half an hour trying to go over the first part of the questions on the roundtable question list and then the other half hour going over the other questions.
I want to point out, it is not imperative you answer all the questions. Okay? So, for those who are type A, think you need to get through them all, it's okay if you don't. They are really just to stimulate discussion, figure a few things out.
We also will need a note-taker for each table. Preferably with a laptop. If any of you here with laptops would like to think about volunteering to do that, that would be an absolutely wonderful thing for you to do.
We will do the roundtables, then have lunch, and then we will have our poster sessions. For today's poster sessions, I think most of you have been already; in case you haven't, go out these doors, take a left, then another left, into the hallway at the back, parallel to these rooms: J, K, and L.
Tomorrow they will be across the hall in 10, 11 and 12. Those who have posters today, if you haven't already set up your poster, you will have some time, I guess after lunch to do that. One other thing I want to point out, those who have posters for tomorrow, if you have been working on your poster, please make sure the poster you are working on for tomorrow is put under the table or up against the wall. The people who have posters out today need all that space. Make sure you put yours under the table and you can move it tomorrow.
Okay. Then, after lunch and the poster sessions, we have a break. Since you will have been working at your station if you are presenting a poster, you can take a break then. After that we will have more speakers, and then a wrap-up. Don't forget we have the reception at Pow's lounge; we hope to see you there.
With that being said, a couple of other things: Wi-Fi is available in the hotel lobby and also across here in the cyber cafe. I think that's it. Any questions?
Oh, I forgot the most important thing, the bathrooms. If you haven't found them, go out, take a right, and walk past the escalators. Women are on the right, men on the left -- both, I think; on each side, men and women, two either way.
Any questions? Without further ado, I will introduce our next speakers. We have speakers on evaluation; we know how important this topic is. Dr. Bora Simmons directs a project initiated in 1993 by the North American Association for Environmental Education to help educators develop and deliver effective environmental education programs. After 20 years as a professor at Northern Illinois University, Bora retired and moved to the Institute for a Sustainable Environment at the University of Oregon. Bora has been actively involved in environmental education research, evaluation, and professional development for over 30 years; she has taught courses, given presentations, and facilitated workshops internationally. She has served on the NAAEE board of directors, as chair of the social studies environmental education SIG, and on the NCATE EE standards writing committee and the EE network. She serves on committees and boards including the national Project Learning Tree education operating committee and Environmental Education and Conservation Global.
For her achievements and service, she has received various recognitions, including the Walter E. Jeske Award for outstanding contributions to environmental education and a research and applied research award.
She earned her BA in anthropology from UC Berkeley, a master's in environmental education from Humboldt State University, and a PhD in natural resources and environmental education from the University of Michigan. After graduating from UC Berkeley she served as a Peace Corps volunteer in South Korea. Thank you; welcome, Bora Simmons.
[Applause ]
Our other speaker is Dr. Howard Walters of Ashland University, author of 41 scholarly, peer-reviewed publications and numerous national and international presentations. He has been the first author or principal investigator of 67 federally funded research proposals over the last 17 years, focused on the K-12 teacher education pipeline to encourage development in science and marine education through zoos, aquariums, museums, and science centers. He authored a book on professional development for science teachers and has served as internal and external evaluator for regional and national science education programs and agencies, including the National Science Foundation, NASA, NOAA, and the National Oceanographic Partnership Program. These projects included Operation Pathfinder; global environmental education; an NSF teacher enhancement project; GLOBE, the NOAA-funded component; and the regional Sea Grant systemic initiative -- for invasive species, with the Florida, Alabama, Mississippi -- Texas Sea Grant programs.
Dr. Walters has served as external evaluator for the -- education, CORE, national science -- since 1998, and for a variety of other K-12 student and teacher education initiatives for National Geographic, -- reserves, NOAA's office of exploration, and state and college Sea Grant programs; as external evaluator for -- standards, working with National Geographic and the not-for-profit College of Exploration; and as external evaluator for NSF Great Lakes and coastal trends programs.
He has also worked with primary scientist and science teacher professional development programs in the Midwest and -- served as external evaluator and researcher for the Great Lakes regional -- and as external evaluator and learning scientist for the Columbus Zoo and Aquarium's education department. Please welcome Dr. Walters.
[Applause ]
I really think you two need to get a little more busy, you don't have enough to do.
Dr. Walters: We'll work on it.
The room is blurry, these people need to work on that, or I better find my glasses. Is it stuffy in here? Everybody stretch, come on.
All right. The teacher in me wants to rearrange the furniture on the platform.
I know --
The agenda says we're presenting one after the other. That's a lie. We are going to just talk, and it's so nice to actually meet her. I never met her until just this morning, about an hour ago.
Wandering around, Howard --
I had my tag backwards -- we were introduced by -- I have her books on my shelves, but we were introduced via e-mail by Jennifer, in association with this meeting and we started talking on the telephone. It was really great. It was so great we decided, instead of separate PowerPoints and the formal click through and talk, we were just going to talk to each other in front of you today and --
Go about your business, have a nice conversation -- how long have you been doing evaluation stuff and --
Dr. Simmons: I got involved when some of you were just twinkles in the eyes of your parents. Back in the 70s, I had just finished my master's degree and was working for the Mendocino County schools, doing the things newly minted teachers do: setting up workshops, teacher professional development, substitute teaching on the side to make the rent. The county schools, being county schools, would periodically say, here, here's another project, why don't you conduct an evaluation? I had no clue. I bumbled through a number of different projects for the county schools, and I will tell you a story about one later.
What ended up happening is that in doing that, I recognized I needed more expertise. It was through evaluation that I decided to go on for my doctorate at Michigan. It was very much that notion that I knew, I thought, how to put on a good program, and I certainly had a point of view, with environmental education being my chosen profession, about what I thought needed to happen in the field, but I didn't know how to document for others and understand how my programs were really working. So that pushed me off into evaluation and further education.
Interesting.
How did you get --
That is -- we have matching ones now --
You are so lucky, a Michigan person -- you are talking to a person from the Buckeye state.
I was in the band.
How did I get started? I was an elementary school teacher, a Montessori, early childhood educator. I saw another Montessori person here today -- where? Yes! In the back. In the back, on the rug.
It's interesting, the things that prompt us to ask these questions. I was teaching school in an urban setting, and I loved to take my kids on field trips, back when we could do field trips without standards alignments -- I hear chuckles; there are classroom teachers here. As good teachers do, I had been scraping around for money to pay for buses, and I found Sea Grant. They were very gracious to provide funds so I could take my kids to the beach in Pensacola. I grew up in Pensacola, Florida, with its white, now sort of oil-stained, beaches.
We would take our kids to the beach, and what got me into evaluation, Bora, was a child named Jannika.
You go over the Bob Sikes Bridge out to the beach, the EPA lab over to the right, a big, tall, humped bridge. When you hit the top of that bridge in the daytime, the white sand beaches of Pensacola go left to right out in front of you. I take these kids over, and Jannika is sitting on the front seat next to me, second grade. She lived in a housing project, and when we hit the top of the humped bridge, she started hollering as only a kid can holler: look at the snow, look at the snow!
You are thinking, what the hell! This prompted me -- landed me in evaluation, really permanently. I started to ask questions like: how can the world be that screwed up that a little child who grew up three miles from the beach in Florida sees white and her first thought is snow? That's wrong!
I went back to school to study what's wrong with the world and took a doctorate in measurement theory. My wife says that's what's wrong with me. I was a Sea Grant doctoral fellow at Mississippi Sea Grant. My major professor was Sharon Walker. I walked in the first day, and she said, your job is to run the evaluation for all of my programs. Well, how many are there?
She said well, I report 37. All downhill from there.
And thus we get into the world of evaluation.
Oh boy.
Soak it in. We love terms. Evaluation isn't any different from anyplace else. Am I on?
Hold it up.
It's got to be my best friend.
What's your favorite one?
I will say I am a total geek about evaluation. There's going to be groans here, prepare. I love logic models.
Okay.
It's one of those things -- I know, but to me logic models, for those who are not familiar with them, who have not been blessed with the opportunity of writing one: a logic model just maps where you want to go. What do you have on hand? What are you going to do? How many of them? What's that going to mean in the end? It's a map. I love it, because it does map it out for me; I can keep going back to my logic model. That's my favorite of all the funky things. You have a favorite?
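To make the shape of that map concrete, here is a minimal sketch in Python of a logic model as a simple data structure; the program, its inputs, and its outcomes are invented for illustration and do not describe any actual NOAA program.

# Hypothetical logic model: inputs -> activities -> outputs -> outcomes.
# (Plain dicts preserve insertion order in Python 3.7+.)
logic_model = {
    "inputs":     ["2 educators", "ship time", "volunteer scientists"],
    "activities": ["week-long teacher research cruise", "follow-up webinars"],
    "outputs":    ["25 teachers trained", "25 classroom lesson plans"],
    "outcomes":   ["teachers use real data in class",
                   "student interest in ocean careers grows"],
}

# Print the map from resources on hand to hoped-for long-term change.
for stage, items in logic_model.items():
    print(f"{stage:>10}: {'; '.join(items)}")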
As a long-term educator, I hate accountability, because we all know none of us wants to be accountable for anything. It hangs up in the sky for me, but I really like impact. That is one that has really registered with me over the years. It points out the structural flaw in the entire federal government. It also points out the greatest hope -- you weren't expecting me to say that, were you? It points out the greatest opportunity we have to really make a difference.
Impact is what happens over time. In evaluations, the money tends to run out right about the point you can start looking for impact. That's a problem. But impact, change over time, is the fundamental question we have to get to. We have responded to this confusion -- I should do a little commercial announcement: Bora and I wrote a paper together on these issues, six key issues. You have a copy in your materials, on your thumb drive. This is where we will start walking through some of the key issues. We started out with: this really is confusing. How can we clarify evaluation, and what do we teach our graduate students when we get ready to do research? Carefully define your terms.
It struck us that evaluation can really be benefited if a whole lot of people would take a deep breath, stop, and carefully define their terms. We've carved out six issues we are going to talk about with you this morning. Here we go.
The first term. Perhaps we should carefully define what in the world we mean by evaluation.
Anybody know what this means? Let's stipulate something right up front. Evaluation is not research. There is substantive confusion between these things. They look similar, but not always. They have similar methodologies, sometimes, not always. But we need to always know: are we talking about doing research, or are we really talking about evaluation? These are not the same things.
A lot has to do with the approach and purposes behind what you are trying to do, because, as Howard says, the methodologies are often the same. We're doing surveys, tests, interviews, a variety of types of data collection, whether we're conducting educational research or trying to evaluate a program. In fact -- not only do I love logic models, but I like Venn diagrams -- I see there can be an overlap. There are times you might conduct educational research while you are evaluating a program or project or product. They are not mutually exclusive of one another, but their purposes are really different.
At least from my perspective, a research perspective, you are really out there to create knowledge. You are trying to understand how the world works, at least how some sliver of the world works. You are trying to understand how students learn better or how to make a better widget. Trying to do something where you create knowledge, add knowledge. That's your big purpose in life.
Dr. Walters: With evaluation, the metaphysical end is the successful educational program, a quality product. We say evaluation is embedded in the process and directed toward the end of creating a successful educational program.
Research has to function under a more rigorous and less biased sort of approach. Evaluation, properly understood, has the same goal as the project director or developer and designer who created this teacher program or curriculum or lesson plan, and so the evaluator is really working under the same goal as the project director. It's quite a different process. The researcher and evaluator may both love regression, p-values, those sorts of things, or not. But when evaluating a program, I am on your team. When I am a researcher, I am on the team of knowledge, the team of our conception of the world. It's quite different.
Dr. Simmons: Right. It's one of those things where it goes back to defining what your purposes are. Why are you entering that room? As program people, whether you are the classroom teacher or the lab person or the PI on a particular project, as you start to think about evaluating your program, it's important for you to think about what you want to get out of this evaluation. What are the purposes of your evaluation? That's the question to talk with your evaluator about. Really clarify why everybody is sitting in this room.
Dr. Walters: Have you ever been working on a project and extracted an important piece of research from it, published it -- tell us about one of those things --
Dr. Simmons: I have an example like that. This is a little bit of a twist on Howard's question. It goes back to when I was a doctoral student. I happened to get involved with the League of Women Voters in Ann Arbor, Michigan. Towards the end of the 70s and into the 80s, the siting of hazardous waste was an issue; people were concerned about whether dumps of -- ought to be sited in that particular county. The League of Women Voters wanted to do some programs, run some workshops on this, so citizens in the county could become better informed about the issues.
I was a young doctoral student, taking research courses, nice multiple regression, great things like that. I went into that project as a volunteer, but as a researcher. My mindset was not to evaluate the worth and functioning of a citizen workshop; it was to think about this as a research project, while also gathering good information about the functioning of the workshop. How we ended up setting it up: we had multiple workshops, all on Saturdays. Some of the workshop groups gathered, had coffee, signed in, blah, blah, blah, got on a bus and toured the county, had some talks about the landforms, the technical issues around hazardous waste, all of those sorts of things, then went away.
The other groups came, had coffee, signed in, sat down, got slide presentations from the same experts who gave them the same technical information, but through a virtual field trip, in essence.
What I had done was set up a pre-test/post-test, and I did a follow-up months later with the people, to do a comparison between two different implementation strategies for passing on information about hazardous waste siting.
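As a minimal sketch of how the gain scores from a two-group, pre-test/post-test design like this might be compared in Python: the scores below are invented, and Welch's t statistic on the gains stands in purely for illustration, not for the analysis actually run on the workshop data.

from statistics import mean, stdev

# Hypothetical knowledge-test scores (0-20) for each participant.
bus_tour = {"pre": [8, 10, 7, 11, 9, 12], "post": [15, 17, 13, 18, 14, 19]}
slides   = {"pre": [9, 8, 11, 10, 7, 12], "post": [13, 12, 15, 14, 11, 16]}

def gains(group):
    # Per-participant gain: post-test score minus pre-test score.
    return [post - pre for pre, post in zip(group["pre"], group["post"])]

g1, g2 = gains(bus_tour), gains(slides)

# Welch's t statistic comparing mean gains of the two independent groups.
se = (stdev(g1) ** 2 / len(g1) + stdev(g2) ** 2 / len(g2)) ** 0.5
t = (mean(g1) - mean(g2)) / se
print(f"mean gain, field trip: {mean(g1):.2f}")
print(f"mean gain, slides:     {mean(g2):.2f}")
print(f"Welch's t on gains:    {t:.2f}")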
Somebody who went in with an evaluation hat on would probably not have suggested: let's complicate this dramatically, have two groups, make sure people are equal and matched, with all that added complexity. They would have said: let's design the best workshop we can around hazardous waste. If it's a workshop with slides, grand; if it's something else, whatever. Let's do that.
I went in with the researcher model, wanting to do a comparison. It ended up working great. Interestingly, the field trip turned out better in the short term; in the long term the differences disappeared. I did -- [indiscernible] What about you? Going into an evaluation setting, have there been times you added in research, or actually said, no, this isn't the place to bring in a research component?
Dr. Walters: Actually both. Let me speak to the former. I have enjoyed the opportunity of working with our Great Lakes project, a Center for Ocean Sciences Education Excellence. I went in as the external evaluator with the idea that I would prepare formative and outcomes evaluations and be embedded in the project. I have a lot of experience doing those kinds of programs, so I had some expertise and science content knowledge to understand what they were trying to do.
After the first year it became clear, because of funding issues and how the project was designed, that they actually had three different models of teacher professional development in place. They were running a traditional laboratory, classroom, and field-trip based week-long teacher workshop, with teachers working with scientists and Sea Grant educators and obtaining curricular materials. They were also running a ship-based teacher program working with the EPA's Lake Guardian in the Great Lakes region, with teachers actually residing for seven days at a time on a ship -- the same core content objectives, but an entirely different platform.
Then they had a third model. Because of a supplemental funding opportunity, they didn't have additional educators available, but they had a couple of research scientists at the Great Lakes Water Institute running a workshop for teachers in Milwaukee and the broader basin. Here was a model where the scientists themselves would set up and run the workshop on their own, without formally prepared educators working with them. All of a sudden I am looking at all my surveys and interview protocols and realizing we have a perfect comparison group set up here.
Over these years I found a way to measure content growth within a statistical framework where we have a different model for each of these three programs. We published one paper and have another manuscript coming out this next season. All of a sudden the evaluation opened up an opportunity to do some very formal and rigorous research. Again, we were always clear on which was which, because we didn't want to confuse the audiences with the different kinds of data we were collecting.
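For a concrete picture of what comparing content growth across three program models can look like, here is a minimal sketch using a normalized gain (the fraction of possible improvement actually achieved); the scores, test maximum, and model labels are hypothetical, not the project's actual data or statistical framework.

from statistics import mean

MAX_SCORE = 25  # hypothetical maximum score on the content test

# Hypothetical (pre, post) content-test scores for teachers in each model.
models = {
    "classroom workshop": [(10, 18), (12, 20), (9, 17), (14, 21)],
    "shipboard program":  [(11, 22), (13, 23), (10, 21), (12, 24)],
    "scientist-led":      [(9, 14), (11, 16), (13, 18), (10, 15)],
}

def normalized_gain(pre, post):
    # Fraction of the available headroom (MAX_SCORE - pre) actually gained.
    return (post - pre) / (MAX_SCORE - pre)

for name, scores in models.items():
    g = mean(normalized_gain(pre, post) for pre, post in scores)
    print(f"{name:20s} mean normalized gain = {g:.2f}")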
That brings us to the second issue, picking up the pace. We have embedded questions; you will get copies of these. Our talk will be guided by questions for you to ask yourself about your program. Another commercial announcement.
The second issue is when evaluation takes place. Do we understand whether it is on the front end, the back end, or all the way through? How have you seen it, observed it, and participated in the timetable of evaluation?
Dr. Simmons: I immediately go to visions of the embedded war correspondent in Iraq. It feels like that sometimes as the evaluator -- this notion of the evaluator being embedded in the program. I have seen every possibility there is, including where an evaluator has been brought in literally at the last moment. Oh, I am going into the field -- tomorrow -- can we figure out a pre-test for the kids? Sure, that's a great idea.
I can top that story.
Top it.
A state which shall be nameless, Ohio -- large federal money moving into a program, a state-wide initiative in a certain area, and I am a volunteer external -- internal evaluator for a couple of school districts. They didn't have any money for paying an evaluator, so I volunteered. Figure that -- I got free ski lift tickets -- I can be bought -- but I am not cheap.
I got an e-mail two days ago, right before I left, from the contractor they hired as the external evaluator: they want to get going on a pre-survey for all of the children and all the students who participate in the program. They went to a pre/post design where, two and a half years into a three-year project, two days ago, they announced they are going to do a pre-survey. They wanted to know if I wanted to review the document. I said, I am sure it's fine. Yeah.
Yeah.
Back at you.
Definitely back at you. I am thinking of an evaluation of one of these farm programs: get the kids out to the farm to experience real-life vegetables in the field. Just as with the story of Pensacola -- I live in Eugene, Oregon -- these were kids who had never been to a farm. The program people picked up an evaluation, a pre- and post-test, from folks in Nebraska. Nebraska, Oregon -- about the same.
That's where Mount Rushmore is, isn't it?
Go North.
Their test -- they literally got it off the web and Xeroxed it -- had vegetables we didn't grow and didn't include some of the vegetables we do grow. It was one of those "can you review this?" moments.
They grow strange stuff in the West --
That's further South in Humboldt -- went there too.
Back to the notion of being embedded. As an evaluator, my happiest experiences are not the ones where I am given the -- whatever, the midnight call for the 8:00 a.m. implementation; my happiest are when I am brought in at the very beginning, in fact at the stage when the team is still trying to figure out the program series, how they want the program to work. One of my examples: many years ago I worked with the Chicago Academy of Sciences, and they had an EPA grant to do a multi-year program with the Chicago public schools, called EcoCIT, ecological citizens. The teachers got professional development, the Chicago Academy of Sciences staff developed the curriculum materials, and the whole idea was scaffolded, moving from topic to topic, year to year, all the wonderful things in the world. Their staff people would go into the classroom and teach a lesson; over the next one or two weeks the classroom teachers were supposed to teach a lesson; then the academy folks would come back in and teach another lesson. It went back and forth, with portfolios, all sorts of wonderful things. It was an absolute dream from an evaluator's perspective. I was there at the very first discussions and all the way through.
We were able to talk about why they wanted to do what they wanted to do with the program, which allowed me as external evaluator to build in evaluation components that didn't seem awkward, and that were truly compatible with what was going on in the classroom.
They had various notions of having the kids produce some things through these lessons. Why don't we create portfolios for the kids? That makes sense. The parents get to see them, the teachers do, and -- ha-ha, evaluator -- I get to see the portfolios. I see real products of this program as it goes through the weeks, and I am able to look at what the kids are learning and producing.
We were able to build an evaluation that really started from day one and went all the way through -- and over multiple years, because we could follow the kids from second grade to third grade to fourth grade, to the extent possible in schools where 50% of the kids who start the school year don't end the school year in the same school.
It did allow -- this is where we get to throw in more terminology -- for formative evaluation as well as summative evaluation. Following things along, helping to improve and tweak the program, as well as getting that data at the end that says, well, did the kids learn what we hoped they would learn, and did the teachers change their behavior in the ways we thought they would. We did have a program model. What about you?
Dr. Walters: Similar -- a project I can share that ties to our next issue, a bridge project. A colleague of mine, Dr. Tina Bishop, here in the room, with the College of Exploration. Tina and I have had the great opportunity to evaluate the impact of the National Ocean Sciences Bowl -- I started in 1998, Tina joined in 1999. We have been following these kids for 11, 12 years now. From the end of the first year the Ocean Bowl began, NOAA funded a continuous look at these kids. Tina and I over the course of a decade have used every possible methodology we could: surveys, content tests, interviews, videotaping, online discussion spaces, case studies, and so forth. It's been a tremendous opportunity to take a long view of an organization. We are now interviewing and talking with people who are out of the education pipeline altogether, employed as scientists, doing things like -- on the West Coast, working for the park service -- people who came through the Ocean Bowl, were educated in a STEM field, are employed in a STEM field, making a difference in this nation. To see that firsthand is really, really exciting, and that moves us to the third issue. That is the epistemology -- big, big term -- what do we want to be able to say when the evaluation report is finished? Evaluation needs to draw from multiple types of knowledge. There are in fact knowledges; there is not one knowledge. We've written about this for you. We will be brief at this point.
The opportunity we had with the Ocean Bowl was to capture three kinds of knowledge. First, objective scientific knowledge, the type typically measured mathematically and analyzed statistically -- a positivist orientation to knowledge collection. We counted everything in everybody that can be counted. You can count everything; I learned that on Sesame Street. You can count everything, and sometimes you learn important things by counting those things, but that's only one type of knowledge.
We've also captured narrative, qualitative inquiry. Does this program impact those primary participants, and how do they themselves describe this experience? It's fascinating to see hundreds of young people, years after high school, 80% of whom tell us they are still in communication with the high school teachers who were their Ocean Bowl coaches. Eighty percent still in communication with their high school teams, years after they are out of high school, spread out all over the globe. The stories they tell us describe a social system that has emerged out of a program. That's a type of knowledge. No, there are no control groups, and it doesn't lend itself to probability analysis, but it's meaningful to a group of very high-ability kids we very much want to understand and recruit into the STEM pipeline. We had better be listening to their voices. The third thing we did was identify some external authorities who could speak -- the third way of knowing, the voice of the transcendent authority figure who knows something certainly and speaks into what we are doing. We had papers blind reviewed. We put them out into that process and published several things as a result of the work. At one point CORE had an outside evaluator review our evaluation work as a quality control process -- after 10 or 11 years, that's a valid question. Are we doing what we should be doing? We welcomed that kind of attention. I know you do as well. Please, scrutinize what we are doing as evaluators.
We know the agencies are doing that. Capturing different kinds of knowledge because we have to speak to different audiences with different descriptions of our programs is a very important issue.
Dr. Simmons: Just as with the previous issue of when you do evaluation, embedding the evaluation -- this is actually, in some ways, the question that should be ruling your evaluation plan. What is it you want to be able to say about your program? That should help you decide what type of design you want. We were joking at breakfast -- Howard was telling a story about a graduate student who was in a multiple regression class and, all of a sudden, ah-hah, that's what I want to do the dissertation on. You have a hammer, everything looks like a nail. Evaluation is apparently like that -- I have a survey, let's use it. I have a technique, I have a control group. I have, whatever.
It's really about trying to answer this question of what it is I want to be able to say, and chances are you will end up telling yourselves and your group, as you're designing the evaluation, that what you want is some sort of mixed method. You want to, probably, collect some hard data -- objective, scientific, numbers kind of data -- so you can look at impacts, calculate differences, all of those things. But chances are you are also going to want to collect qualitative data, even if it's just to be able to fill in the story and help you interpret that quantitative data.
Those stories are incredibly important, and a lot of the evaluation also relates to the context where you are. Some methods do not work as well, depending on the context. If you are working with, say, kindergartners, you are probably not giving them a written 100-item quiz, having them fill in bubbles. Maybe you are.
Things are changing in kindergarten these days, but you have to look at the context. Several years ago I was working in a county that wanted to do a needs assessment -- one of those cases where you are embedded in the system. That's a type of evaluation we didn't talk about, the notion of front-end evaluation, even before the program has been developed. Thinking about: do we need this program? How do we deliver it if we do need it? What are the most important issues? Who are the audiences? All those great and wonderful questions.
I had been asked to go into this particular school district to find out whether or not they wanted to implement a series of reading and math programs that the state wanted to implement.
Well, in looking at this community -- yes, it was a school district, no question about it. Talking to the school district people made some sense. I could have easily figured out a survey, sent it in to the principals and superintendents, done all that business. But I also found out this community was really a tribal community. It made an awful lot more sense to go meet the tribal elders, sit down, talk with them, have conversations with them about how they thought their schools were working; what their hopes and aspirations were for the schools; whether coming in from the outside made sense; and how we could make it make sense if it was going to be mandated from the state.
That sort of narrative, that discussion, was far more important for understanding the context of that community, so that I could report back to the powers that be about how this might work or not work. It was really a front-end evaluation rather than anything else.
So, moving to the next, most important issue: Bora, are you an innie or an outie?
What's the deal with internal and external evaluators? What are the differences? We need to pay attention to this. You want to take a shot?
Dr. Simmons: To belabor the point: with an external evaluator, you hire somebody who is not part of your program team per se -- although once we make the argument about being embedded, that starts to blur a little. Chances are it's the classic: I have funding, let me hire an evaluator, someone from the outside who will be your external evaluator. Versus finding someone from within the program, project team, agency, group, or school who will run the evaluation from inside. One of the big questions.
The first thing you ask yourself, completely pragmatic, what does the grantor or funder want? Sometimes the funder says you have to have an external evaluator, in which case, okay. Then Howard and I would say great, bring them in at the very beginning, embed them and then you have an internal/external evaluator.
I have been both: the external evaluator in the best of worlds, the external in the worst of worlds, and also an internal evaluator. To me, by parsing it out into internal versus external, people aren't asking the right question. To me, the right questions have to do with -- actually, evaluation ethics. Are you competent, do you have the expertise to conduct this evaluation, and are you going to hold to high standards of integrity and honesty? An internal evaluator can do that; an external evaluator can do that. It really is just a question of your comfort level within your program, and perhaps requirements from the outside, which may be your funder, may be a school board, a board of directors, other authorities.
Dr. Walters: I agree. I will add to that, and then we will wrap and take questions in just a second so we can hear what's pressing for you. I would add that whether you call your person internal or external, what is critical is a series of questions. Does this person have a sufficient view of the program or product, over the life span of development and implementation, that he or she can in fact see what's going on, how it's happening, and understand -- to borrow another language -- the variability characteristics influencing that setting? So, perspective is so important.
Second thing: whether internal or external, has this person been given sufficient influence over the direction of the program or product that he or she can use that expertise and vantage point to actually change the outcome of where this thing is going, mid-stream? Can the evaluator somehow, if not get a hand on the ship's wheel, at least get a hand on your arm and yank hard, so as to make a difference? From our perspective the evaluator is not a researcher; the evaluator is on your team and needs influence to help you make good decisions.
The third piece: whether internal or external, does the evaluator have sufficient evidence to back up what she is saying? And are you willing to listen to that evidence and pay attention to it? That nails down for us the issue of internal and external. The time is short -- you have the rest of this to read; this is where we are, from what we've done. How do you react to this? What do you want to ask, to laser down on and talk about?
While we're setting up -- the other two issues on the thumb drive relate to the notion of whether every evaluation has to be completely unique, and whether you can actually structure things to learn across evaluations, and the issues around that.
Then, the final thing -- we are both academics at heart, truth be told. We're trying to make the argument that you can learn from other programs, other evaluations. So doing a literature review would actually be a nice place to start, to keep you on track, because -- why reinvent wheels? There isn't enough time in the world to reinvent wheels. For you to know, in that paper we have written we listed eight or ten resources that you might be interested in to help with some of your evaluation work. Some are just handy how-tos, websites, and such.
With that, somebody had their hand up.
Question: I have been messing with these little surveys -- I have taken so many, I use SurveyMonkey, blah, blah, blah. I am working with a school, dealing with children. Even if their names aren't there, if I publish anything, are there legal issues dealing with people under 18 years old?
Absolutely, there are. First -- I am not sure if you are employed in an organization that is covered by federal policy, but if so, you should have an institutional review board or research board to monitor human subjects research issues.
Now, the courts have looked at this question, and essentially the consensus is -- it's complicated, but as a teacher you have wide liberty to test and survey the children in your classroom, and within a school, the faculty has wide liberty to monitor the growth, learning, and development of those children, for the purpose of primary benefit to that group of kids and the improvement of their education programs.
But one would walk on very shaky ground moving toward a publishing mindset, where you are going to begin to publish the results of that kind of work, without having gone through an informed consent process and IRB oversight.
The courts have defined your role there as that of a scholar researcher; it's a touchy issue.
Question: If a teacher is tied to a larger research project, clearly the institution or organization she's partnering with -- the university, for example, or one of the federal agencies -- would have to deal with the IRB issue. Within her own school district, although a teacher can do some surveys within her classroom, generally the school district would have to sign off on anything done, even if it was only one classroom in a large urban district participating.
The other issue in a classroom situation: clearly no child can be mandated to take the surveys. How you collect the data, how parents sign off on that, and whether the data are blind is critical. I value all the information you are sharing. One thing I am just curious about, a way to address today: we have multiple audiences in the room -- teachers who are participants in programs, often taking our evaluation surveys, but who aren't consulted about how the evaluation tools could be improved to capture the data we are seeking. We would like to be able to share clearly across all the programs, whether NOAA or others -- looking to find the common stories we are trying to tell to garner support for these kinds of initiatives we are so passionate about.
I am curious how you are thinking about this -- you both evaluate programs across a lot of agencies -- how we can move toward compiling some of these stories in some way, using common instruments, validated in the -- [indiscernible]. It would be most powerful for all of us to be able to do that, for the faculty and project directors to be able to start collecting common data. Maybe you could elaborate?
Dr. Walters: As you said, it's pretty critical. I will share a model that's emerging: COSEE, and Gail Scowcroft is here; she serves as national director. That project is now in its tenth year, essentially. It's gone through two five-year funding cycles, and you know how they vary. As with most every federal initiative and program, the early years were about getting it up and running. That was the emphasis, and evaluation was secondary or tertiary during those early years. At this point it is a more mature program; it has gone through a couple of major funding cycles. Now, with a central coordination team in place, they are making some really good efforts to do some of this standardization work. It is hard work. Gail would be the first to say it's difficult, late in the game, to begin constructing some of these systemic approaches. There have been thousands of person-hours invested by many people. In our COSEE network we have developed, for example, a shared teacher survey and a shared scientist survey, gone through a piloting of those instruments and a full national implementation of them, and we will do a second full implementation of the scientist survey nationally this coming winter. It's a tough task. COSEE is becoming a good test case, a good national model to look at for what it's really going to take to do what our colleague has just suggested: to find some standard tools that we can use across the nation. That's becoming very important. We can look to a place like COSEE for ideas.
As we look at that, my sense is it's going to cost more than we think it does, because we have generally partitioned evaluation into about the last thing we think about. I personally have slashed the evaluator out of projects I have gotten funded, right? Because when it came down to three extra classes of little kids coming to the aquarium, or the evaluator out the damn door -- then he's gone. I have done that. Right? Those are hard decisions to make. On a national level we have to quit doing that. We have to quit doing that.
I think we are going to have to systemically look at a different organizational structure in Washington for the evaluation of programs and products. I know NOAA is looking broadly across the whole NOAA line office structure for a way to standardize evaluation for the whole NOAA system. NSF is involved in that conversation. I am optimistic some of these large programs can be researched as models to learn to do evaluation more effectively. It will take time and sustained funding beyond the programming funding years -- so that when the program ends this year, we can have three, four, five more years to follow this thing. That's what I would say back to that sort of comment. Other questions?
We're around all weekend, so --
Question: Hi. First, thank you for giving this talk. As I heard you guys, I was saying, wow, they are describing me. I am taking GLM; I am the kid with the hammer, everything is a nail.
I am at the resource level -- we, not just me, my department -- and our job is to promote scientific literacy and to help teachers K-12, and we developed a professional development program. As educators we always think, well, what we are doing is the right thing. We want to help teachers; my heart is in helping teachers. I was a teacher a long time myself. But you are saying evaluation creates programming. We have been doing this four years already. Is it possible for evaluation to be conducted on ongoing programs?
Yeah, absolutely. We tend to think of evaluation in terms of somebody putting forth a grant, creating something new, evaluating it, moving on in the world. But there's absolutely no reason not to, and one could make a really good argument that because you have had a program up and running for a while, it's a perfect opportunity to take stock. When we were having our discussions, it was that notion of the evaluator as the better angel sitting on your shoulder, whispering in your ear. If you have a program up and running, I think it's ripe for a formative as well as a summative evaluation -- formative not in that program-development sense, but more the monitoring. How is it working at each stage?
You may have a really good sense that your outcome is happening in the ways you want it to happen, but what are the different pieces along the way that make that work? Or are there parts that aren't working as well as they should? Maybe they need to be tweaked. That's where the evaluation can look at your program and help you monitor it, as well as do some of the summative work at the end to measure or calculate the impact. Collect some of the stories, and collect them in a systematic way. Evaluation -- we all have stories. I ran resident outdoor ed centers for many years. I have thousands of stories of the kids, often named Jason, who was "the kid." The teacher would come in and say, "You gotta watch Jason." Jason was great -- a little devil in the classroom, but at the resident center Jason wasn't tied to the chair, was wonderful. I have thousands of those sorts of stories. That's not evaluation. It wasn't collected systematically.
You have a successful program, but now it's a question of systematically documenting what works, what might need a little improvement, and what you are getting at the end -- some of the impacts.
All good stuff. We are getting to the wrap-up stage. Something came into my head at this point -- the issue of how you fund evaluation toward the end, or at the end, of a long-term, nearly finished program when it wasn't in the budget up front. You didn't share that part, but maybe you're thinking, I have a great program like that; third, fourth year, there wasn't evaluation money, and I would love to do this.
Let me give you an idea -- my last two cents for the day, maybe. We talk about looking at other literature, pulling in other ideas; here's a factoid full of statistical thingies from another piece of research. There's research done on brand-new junior assistant professors, and the number one problem that brand-new, tenure-track professors report is obtaining access to places to do research so that they can get themselves moving toward the publishing requirement. At my institution, if you don't have six national peer-reviewed publications by the end of the sixth year, you are done. It's pretty serious.
All across the country there are young junior assistant professors telling researchers: I am lost, I need somebody to throw me a lifeline so I can have access to something I can research and study. Money is nice, but these folks have salaries -- yes, and other obligations -- but there's a pool of junior faculty out there who need things to study. You are a group of people who have things that can be studied.
It seems somebody should introduce the two of you somehow so you could get to work together. That's just an idea -- find a way to build those kinds of bridges and connections as well.
It was fun talking to you today.
[Applause ]
Thank you again, that was wonderful. I wanted to reiterate a couple of points. Bora and Howard are going to be here -- Bora at least through today, Howard through the weekend -- so there will be lots of other times to talk to them. We are about to have a roundtable discussion shortly after the break, where a lot of you will get to talk. Bora and Howard are two of the facilitators for the groups we have.
In tab 3 of your binder there's a list of facilitators, the programs they are from, and the tables they are sitting at. We would like to fill the tables, please; try not to leave empty spaces. The discussion questions are in tab 4.
The way we are going to work the roundtable sessions: we're going to spend half the time talking about the first half of the questions, and the second half of the time talking about the second half of the questions. The facilitators will guide you, someone at the table will volunteer -- be very enthusiastic about taking notes on a laptop -- and we will