Involving the Community in Decisions: Data Gathering for SILCs. Presented by Judy Sharken Simon and Brad Williams.

>> OPERATOR: Good afternoon and welcome to today's conference call. Without further delay I will turn your call over to Tim Fuchs.

>> TIM: Thank you, Lisa. Good afternoon, and welcome to our first national teleconference and webcast: Involving the Community in Decisions: Data Gathering for SILCs. I'm Tim Fuchs, operations director at the National Council on Independent Living and a staff member of the SILC NET project. SILC NET is a program of the IL NET project, which serves centers for independent living through the CIL NET and statewide independent living councils through the SILC NET. The IL NET is operated by the Independent Living Research Utilization program, ILRU, at Memorial Hermann/TIRR in Houston, Texas, in partnership with the National Council on Independent Living, NCIL, in Washington, D.C., and the Association of Programs for Rural Independent Living, APRIL, in Little Rock, Arkansas. Substantial support for the development of this program was provided by the Department of Education, RSA, under Grant No. H-132E070003. And I have to say no official endorsement of the Department of Education should be inferred.

I have just a couple more very short announcements before we begin the substance of today's call. First of all, please be aware that we are recording today's call. It will be archived on ILRU's website after the fact. Also, all telephone lines are muted, and when we begin our Q. and A. sections, you will be able to ask questions by simply pressing 01 on your telephone keypad; you'll be placed into a queue in the order in which you dialed in. For those of you participating by webcast today, you should see a question submission form at the bottom of your webcast screen, and all questions submitted will be relayed live on the call. Those questions do come automatically to me. However, anyone participating in today's call can send questions directly to me at tim@ncil.org, and I'll relay them to our presenters.

The materials for today's call, including our PowerPoint presentation and an evaluation form, are located on our website. I will give this web address twice; the URL is a bit long, so excuse that. It's www.ncil.org/training/ICM.html. These materials should have been sent to you ahead of time by E-mail, but just in case, one more time: www.ncil.org/training/ICM.html. Please take a moment after today's call to fill out the evaluation form on that same page. It only takes a few minutes to complete. It's very brief, and it's very, very important to us. We do review all evaluation forms that we receive, and they weigh heavily as we prepare for our future presentations and training.

So without any further ado, I want to turn it over to today's presenters, whom I also want to thank for their hard work in preparing today's presentation for you. Today's presenters are Judy Sharken Simon, manager of board services at MAP, the Management Assistance Program, and Brad Williams, Executive Director of the New York State Independent Living Council. Without any further ado, I'll turn it over to you, Judy.

>> JUDY: Great, thank you, Tim. Hello, everybody. As you know, our topic for today is Involving the Community in Decisions and Data Gathering for your SILCs. Slide No. 2 -- I'll be referencing the slides as we go along if you have them in front of you, and if you're on the web, this is our cue to change the slides. We'll look at why bother with community input?
We'll spend quite a bit of time looking at some different methods for gathering input, some of the pros and cons of those methods, and how you can use them most effectively. We'll hear from Brad about his experience gathering input and involving the community, his use of some of these various methods, and how he put them to work. We'll finish our call today looking at what might not work about them, what could get in the way of these working effectively for all of you, and what are some resources you can tap to use these different methods well in your work. And then Tim will come back in and give us some information about next steps.

Slide 3, some of our goals for the session: making sure that you understand the primary data gathering methods; know the advantages and disadvantages of each method; know how to use them appropriately in order for your efforts to be most successful; and making sure that you are aware of the importance of consumer and other stakeholder involvement in the decision-making process, and that these methods are vehicles for you to gather that input. Slide No. 4, Brad is going to take us through that.

>> BRAD: Thank you, Judy. Community input, why bother? Well, it's a good way to involve people and create investment, as well as good business practice, and it creates buy-in. I think it's also very important because as a SILC, we very much want a legitimate state plan. I mean, that should be a high priority. It affirms or denies assumptions and allows us to make more informed and better decisions. And certainly this is something that I think is a high priority. While it takes more time up front, it definitely speeds the process up down the line. And then it promotes two-way communication and collaboration, and I think we know from our state plans they prioritize collaborations.

Slide No. 5 -- commonly used methods for gathering input. There are a few, such as interviews or surveys, community forums, focus groups, or even E-mail comments. Now, Judy is going to discuss these in more detail for you.

>> JUDY: And these are --

>> BRAD: Slide No. 6.

>> JUDY: Slide No. 6. These are the ones that are most commonly used. There are other methods of gathering input; these are the ones we thought would be most useful to all of you and the ones that would be worth getting more information on. So this isn't an exhaustive list of methods, but they are probably the ones that most people think of when they want to involve the community and gather data or gather input.

So we're going to start with interviews. And I use a similar format for each of these methods, so you'll start to recognize kind of the flow. But typically interviews -- when would you use those? Think of it as before, when you're planning, when you're trying to gather ideas for program design, when you want some up front advice. When I was revamping a training program, before we started in terms of redefining the content, we interviewed people who ran other training programs nationally, so we could kind of get a feel for what we might want to do in terms of revamping the training program. Sometimes you use interviews after you've done something or after a program, for assessment or summarizing or reaction. So maybe you have a day-long staff retreat and you want to interview key management: how did it go? What did you think of it? So that you can gather that feedback.
Slide No. 7, some of the most important things to remember about interviews is that they are really good for more in-depth conversation, because obviously in an interview you can probe. You can say, what do you mean by that? Could you give me an example? You can read facial expressions, so you can see their reaction and say, you seem a little uncomfortable with that; can you tell me what might be going on? So you can certainly get more in-depth information in an interview. They are time-consuming, though. It's very difficult to do an interview, either a phone interview or a face to face interview, in less than an hour, hour and a half, especially if you're using it to really gather the kind of information that you've deemed an interview useful for. Interviews are particularly important when key relationships are at stake, when you really want to find out what funders think, or a major donor or a political leader. You really want to have that face time, or at least that one on one time, with that particular individual so that you can help build that relationship.

Slide 8 is some of the downsides of interviews. They lack the synergy you can get through some other kind of group data gathering method like a focus group or community forum. Because it's a one on one situation, you don't get people influencing each other, bouncing ideas off of each other, so you're only getting one person's perspective most often. We said in the earlier slide that they're time-consuming. That's certainly a downside, because you can't get as many in; they take longer, and the kind of resources you need to devote to doing interviews is more than you would experience with some of the other methods. Only limited quantities are possible -- if you think of interviews in comparison to surveys, where you can mass mail to hundreds of people, you simply can't, or usually don't, have the resources to interview that many folks. So you are really limited in terms of the number of people that you can tap and get their input from via interviews. It's also fairly difficult to standardize and quantify the responses, because it's people's perceptions and thoughts -- both how the notes are taken as well as the key words that people use. It's just more difficult to standardize the responses and come up with something that says 30 percent of the people that we interviewed felt this way. And that's much easier, obviously, in some of the other methodologies.

On slide No. 9, we're going to move to surveys. So interviews are one method. A second method, and probably the most common, is surveys. Typically those are used before you embark on something, for market research -- how many people value our service as it is now might be one example. Sometimes surveys are used in an ongoing way, when you're looking at doing a name change or recruiting new clients; you may ask people's opinion about a project or a program that's currently in process. And, again, surveys can be used at the end of a program or service -- after, in terms of an evaluation, such as we'll use today. How was the workshop? So kind of a postmortem look at things.

On slide 10, some of the most important things to remember about surveys is that the design of the survey is critically important. How the questions are configured can really determine whether you get accurate information or not. If it's a poorly designed survey, there might be a lot of confusion as people go to answer those questions, and then your data is compromised.
Quantitative analysis of the results is one of the benefits of surveys, but it also means you need people who can do that quantitative analysis. You need the skill set to do that analysis; otherwise, the value of really doing it as a survey instrument is lost. What do the numbers tell you is obviously something you're looking for in the survey format. Surveys are particularly helpful in reaching a broad audience, because you can cover large quantities of people, and there are very few of the other methodologies that can really do that in the way that a written or an electronic survey can.

On slide 11, surveys also have their downsides. Response rate is one of the most cited reasons people don't use surveys, because a good response rate in the industry is a 2 percent response rate. As you think about it, that's pretty poor; at least with interviews you're pretty sure of getting 100 percent participation from the people that you select. There is also very little opportunity for probing. Most of the time it's an electronic or a written survey and people answer yes/no. You can't determine what was behind their yes or no. Why did they answer that? Maybe they were confused by the question. Maybe their interpretation of a particular term was different than what you had intended. So you don't have that opportunity to explore further about their responses. They can certainly be costly to administer. As I think about written surveys, there is a printing and a mailing cost, and, again, one of the reasons you would use surveys is when you're trying to reach a large quantity of people; the printing and mailing costs that go with large quantities increase, and that's an expense that you need to consider. And online survey instruments -- Brad, you've had experience with online survey instruments. You want to share about the downsides of those?

>> BRAD: They have trade-offs, and they have real benefits, but to start with cost, they can vary. You might find one product that could be along the lines of around three dollars a month, or some other ones that are far more exorbitant, but certainly you're going to pay for them with a credit card and you're going to get hit monthly. So the cost could really go on and on. You better like the product, because it's going to go for months, and certainly if you're going to want to hold on to the product and survey for awhile and keep your data, you're going to want to keep that in mind. So you want to track the cost, budget it, and make sure that you cut it off if you need to, because otherwise it's going to keep billing that credit card.

There is also a training curve involved. Unless you have someone who is very savvy with these types of products, you are probably not just going to be able to jump online and utilize them. You're going to have to invest some time to go on and learn these products, be able to develop the survey instrument, plug in the surveys, and then be able to manipulate the data and the results as they come in. It's going to take time, and you're just going to have to realize that that time has to be invested.

And last, for our population, accessibility. You cannot assume that it is accessible, both on the design end, as you're building these instruments and learning the product, and certainly on the user end. You know, we utilized Survey Monkey, and people had different experiences. What we did was we had three testers who, once we finished the survey, were willing to test the final product to see if, with the screen readers they were using, they were able to utilize the survey instrument. Two were able to navigate through it. A third was able to get through most of the questions, but when there were open-ended questions, they had problems with the dialogue boxes. So you are going to have to realize that a lot of these online survey instruments do have trade-offs, and be aware of them. So slide No. 12.
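For readers who want to see what the basic survey number crunching can look like in practice, here is a minimal sketch in Python. It is an illustration only, not part of the presenters' process; the file name, the column name, and the count of surveys sent are all hypothetical, and it assumes your survey tool can export responses as a CSV file.

    import csv
    from collections import Counter

    SURVEYS_SENT = 500  # hypothetical number of surveys distributed

    # Assumes an export named "responses.csv" with a 1-5 rating column
    # called "service_rating" -- adjust both to match your survey tool.
    with open("responses.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    rate = len(rows) / SURVEYS_SENT * 100
    print(f"Response rate: {rate:.1f}% ({len(rows)} of {SURVEYS_SENT})")

    # Tally the ratings so you can report, for example, "30 percent of
    # respondents rated the service a 5."
    tally = Counter(row["service_rating"] for row in rows)
    for rating, count in sorted(tally.items()):
        print(f"Rating {rating}: {count} ({count / len(rows) * 100:.0f}%)")

Even a small script like this makes the two numbers Judy mentions -- the response rate and the percentage breakdowns -- routine to produce, which is part of why surveys suit quantitative reporting better than interviews do.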
>> JUDY: Yes. So we've talked about interviews and surveys; community forums are the third mechanism that we wanted to give you a brief overview of. When I think of community forums, I think of something that's for large groups -- a public setting where you have stakeholders, 30 or more people. So it's a fairly large undertaking. Typically community forums are used to collect opinions, beliefs, and attitudes about issues of interest to people in the public, to your stakeholders. I worked for an organization called the Wilder Foundation in Minneapolis-St. Paul, and they were very interested in getting involved in neighborhood development efforts, and they wanted to go out into the community and hear from neighborhoods about what their role as a human service agency should be in terms of neighborhood development. So they undertook creating community forums in many of the neighborhoods around St. Paul where they wanted to provide service. They were really undertaking a process of collecting beliefs, ideas, things that were of interest to people in those communities.

It's certainly about building energy, ideas, excitement about a topic, because in a community forum, versus surveys, you have an ability to tap into that emotional context in the setting. You've got a group of people that can build off each other, and you can create a sense of energy and excitement by how you set up a room, how you decorate the space that you're in, the kind of speakers you have there, and the energy of the participants themselves. Another area where community forums are often used is in providing an opportunity to learn more about a topic, because in that kind of setting you can have an expert or an educational speaker, you can give out written materials, or you can have other vehicles for people to pick up information. So there is an exchange that can happen that isn't really as available in some of the other methodologies.

On slide 13, some of the most important things to remember about community forums are the logistics. There are lots of details when you're talking about a group of 30 or more in a public setting -- lots of facets of it that you need to be prepared to address. Secondly, the orchestration of that event can really make a difference. You need good facilitation. Here in Minneapolis there was a situation a couple of years ago where Minneapolis was closing a number of public schools, and they were having community forums to hear the community's reaction to that. Emotions were high, as you can imagine in that kind of public setting, the rules for engagement were unclear, and pretty quickly things got out of hand; so they didn't get the information that they wanted, because they hadn't really orchestrated those events or thought ahead about how to orchestrate them well. But they are a great opportunity for P. R.
because you have an opportunity for that two-way exchange. You have that in any of these methodologies, but a community forum in particular is a good vehicle for getting the word out about your program or service or agency.

Slide 14, some of the downsides of community forums: with all those details in a public setting, the logistics can be overwhelming. You need good staffing. You need people that are prepared to be there, to do the set up, who have skills doing that. Secondly, the group can certainly take on a life of its own. I was just an attendee at those Minneapolis public schools events, and maybe they had planned to orchestrate them very well, and the group kind of took over because emotions were so high. So that can happen in a community forum, versus some of the other methodologies where you have more control over the setting and the group reaction. Lastly, capturing the data can be challenging. There are often lots of people there, and how they are sharing information, how that information gets written down or captured, either through audiotaping or videotaping -- that can be a challenge in terms of how you gather the data in a way that you can utilize it later. Any comments, Brad, about community forums that you've had experience with?

>> BRAD: Yeah, I would say that for me the facilitation notes provide the direction. That's kind of consistent with what you were stating with your example with the school setting. And I'll be speaking to that later in terms of a facilitation outline and how I feel that may help to provide direction.

>> JUDY: All right, thanks. Well, in a little while here we'll be opening up the lines to hear your experiences with some of these methodologies and confirm or deny the things that I've been mentioning here.

We're going to move on to slide 15, which is a fourth method, focus groups -- one that I have particular passion about. I have written a book on the subject, and so I really love focus groups. And my definition of a focus group may be similar to or different from what you're thinking of. When I think of a focus group, I think of a definition that Richard Krueger out of the University of Minnesota developed in his work on focus groups: that it's a group of 7 to 10 people who are together in a facilitated discussion to share beliefs, opinions, and ideas. So a couple of key things about it, if you notice. It's a small group, seven to ten people. They are there to share beliefs, opinions, and ideas, so there is that synergy component. And it's a facilitated discussion, versus a public forum where it may not be as orchestrated or as guided in terms of the questions that are being asked.

Typically, focus groups are used before -- again, in planning, program design and market research. When we were developing, here about a year ago, some advanced nonprofit board governance training, we used focus groups to hear from savvy, experienced board members what they wanted and needed in terms of training and education. So we did that before we undertook developing that course. Ongoing -- the same as in the other instances where you use a methodology in an ongoing way: name changes, recruiting new clients. And after -- again, focus groups can be used to assess things.
Tim could have the option of pulling together a focus group of a number of the people who are on this webinar to say, what did you think about that? -- whether in addition to the evaluation survey at the end, or in place of it, or to gather more information than what's revealed by that written survey. Summarizing, a postmortem, gathering people's ideas about your image -- those are all instances when you might want to use a focus group.

On slide 16, some of the most important things to remember about focus groups is that notion that they are there to capture people's ideas, beliefs and feelings. So they are not very good for gathering issues where you want to quantify the responses. They are also particularly helpful in combination with other methods. I gave you one example with the evaluation after today's seminar: we could gather the evaluation in a written format and then follow it up with focus groups to probe more in-depth about particular issues. They are great at creating opportunities for fun, engagement, interaction. When I do a focus group, I try to weave some fun element in there. I put people in teams in the focus group and ask them to develop a radio ad; that helps them talk about what the mission is. So there are opportunities to do some creative, fun facilitation in focus groups. I'll stop and take a drink here.

The downside of focus groups, on slide 17, is that you do need somebody that is skilled in group facilitation to lead them, for the same reasons as in the community forums: it's a group of people that can take on a life of its own if not well facilitated. Also, time is limited. Generally when I do a focus group, I figure I have about an hour and a half. That's enough time, in my experience, for five or six questions. So the number of questions that you're able to ask in a focus group is compromised versus, again, a survey or an interview, where you can go down different paths and ask different questions. You can cover a hundred questions in a survey, and in a focus group you've got far fewer opportunities to ask questions, because part of what you need to do is take some of that time to set up the environment. You may start with a very generic question, getting people's feelings and perceptions about an issue, and slowly work into some of the more critical questions about the data that you're trying to gather.

On slide 18, the last method we want to cover is E-mail comments. Typically they are used in collecting opinions, beliefs, and attitudes where you know people have strong opinions about things, so that sending their comments via E-mail is a very quick, efficient way to get that information. During the course of a public comment period is a great time to use that E-mail data gathering mechanism, because it's quick, it's immediate, so you can get that information. And certainly, again, after a session, a project, a plan, for feedback: What did you think of this? What are further thoughts on this that we need to consider next time around? So E-mail comment is definitely a vehicle that's helpful.

On slide 19, some of the most important things to remember about E-mail comments. Brad, you want to share those?

>> BRAD: Sure. Definitely be prepared for large volume -- but being prepared is what's important. You must set up your process beforehand, and it is important to be accessible and have a controlled medium, and I'll explain. Two state plans ago we set up the process to be able to receive E-mail comments for our state plan.
And we only received a few comments, but the process was set up and we had the ability to receive those comments. For this past state plan, everything was in place, and we received almost 400 comments by E-mail. So you need to work out the process and be ready. Be prepared. Because you don't know what's going to happen, but eventually, probably that second time around, you hope that you will get overwhelmed, because that is what you are planning for.

Slide 20. The downside of E-mail comments -- well, it is difficult to analyze the data. You lack an ability to probe or bounce ideas off of one another. The responses tend to come from those most passionate, pro or con. They are coming in randomly; it's not interactive. They are coming in from all these different sources, and some responses come from targeted or organized efforts to support popular initiatives, as opposed to random comments. Now, let's face it, we are the masters of this with systems advocacy and action alerts. There should be no surprise, but this should be welcomed. This is extra input, and it's only going to help us have a better state plan. Slide 21.

>> JUDY: Thanks. And again, we'll hear more from Brad's experience about what worked and didn't work and where he got surprised. But regardless of the method that you use, there are eight steps that you should be thinking about to really effectively use any of these methods.

Step No. 1 is defining the purpose. And I encourage people to write it down -- be very clear about why you're wanting to gather this input. You need to be very clear about it so that you can communicate it to other people, to participants: why you're doing this, why you're holding a community forum, why you're sending out a survey, why you want to engage in an interview. It also helps you develop the questions that you're going to ask, because if you're not clear about why you want to gather the information, the questions will probably lead you down a different road, and the answers then of course will not be what you're looking for. And that's the third bullet: getting the information you really want. You don't want to spend a lot of time, money, and resources asking questions and then find out that the information does not tell you what you wanted to know in order to make effective decisions.

A couple of examples of purpose statements. These are examples that I put out there as initially poor examples, along with how we could reframe and rephrase them to make them more explicit. The first one: if your purpose was to gather input on service needs in the disability community, the question you want to ask is, why do you want to know that? Well, it's to find out if the top priority needs in the state are adequately addressed, or adequately covered, in the state plan. So really, if you look at the purpose statement as it's originally stated, it may not exactly say that, and the information you get may take you somewhere else. A clearer purpose statement might be: to gather input to ensure that the service needs match the components that are outlined in the state plan.

The second example: if your purpose for holding a focus group or doing a community forum is to hear what our constituents think we do -- well, why do you really want to know what your constituents think you do? Well, because we want to rewrite our mission statement. And I've experienced that with groups I work with.
Well, you could get a whole lot of information about what constituents think you do, and maybe it will have something to do with your mission statement; but if you were clearer and said it is to hear constituents' ideas about our current mission statement and how to revise it, then your data gathering, the questions you ask, and the invitations you put out to participate in that method will be much more focused and much more effective. That's step 1: take the time to define the purpose of your data gathering.

Step No. 2, on slide 22, is to establish a time line, and this slide is an example of the time line that Brad used in his data gathering. You want to step through that, Brad?

>> BRAD: Sure. In fact, we have here something that went along a seven month time line, and for perspective, I always found it interesting, because I would go to the SILC Congress and there would be some colleagues who might not even start their process until after the SILC Congress, which typically would happen in January, and you could see how busy we would be before we even got ourselves to the SILC Congress. It would happen in May, where we would start off with a SPIL committee, a state plan committee, to make decisions about the process; then by August, develop the important outreach materials; so that by September we would distribute these outreach materials to the network and post them on the website; so that by early October we'd start with the statewide public hearings, and late October start the breakout sessions at conferences; so we could turn it all around in November and then meet face to face with the SPIL committee to review all this increased input. And that would be our time line.

>> JUDY: And as you can see and hear from Brad's description, it's really about backtracking. When do you need the information, what do you need it for -- plan backwards so you have enough time to carry out the method effectively.
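To make "plan backwards" concrete, here is a minimal sketch in Python, not from the presenters: it starts from the date the SPIL committee needs to review the input and works backwards through milestones like the ones Brad lists. The lead times are illustrative guesses, not NYSILC's actual numbers, and the review date is hypothetical.

    from datetime import date, timedelta

    # The date the SPIL committee meets to review all the input.
    review_date = date(2006, 11, 15)  # hypothetical

    # Milestones and how many weeks before the review each should start.
    milestones = [
        ("Committee decides on process and modes of feedback", 26),
        ("Develop outreach materials", 14),
        ("Distribute materials to the network and website", 10),
        ("Statewide public hearings begin", 6),
        ("Breakout sessions at conferences", 3),
        ("SPIL committee reviews the input", 0),
    ]

    for task, weeks_before in milestones:
        start = review_date - timedelta(weeks=weeks_before)
        print(f"{start:%B %d, %Y}: {task}")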
On slide 23, steps 3 and 4: identify and invite the participants -- where are you going to get the names of people to invite? How are you going to invite them? Think through those pieces -- and generate the questions to be asked. I use a four-part model for generating questions. I take time to just brainstorm: what are all the possible questions we could ask in this survey or this focus group? I prioritize those questions, and then a really important part of that is testing them. So whether it's with co-workers or even to myself in a mirror -- if somebody asked, what do you think the most important thing about this strategic plan for this organization is? -- I ask myself the question and think, how would I answer that, and is that going to give me the information that I need to do the thing that I've set out to do with this data gathering method? So steps 3 and 4: identify and invite the participants, generate the questions, take time to test those questions, and then revise them accordingly.

Slide 23 and 24 -- excuse me -- slides 24 and 25, which we'll show you in a minute, are the example from the New York SILC. I won't ask Brad to walk through this, but just for you to scan it: this was a facilitation outline that they generated, to show you the kinds of questions that they were asking in their process.

>> BRAD: Now with that, really, I felt that the facilitation outline helped to guide the input and process, no matter what mode or venue we used. And what was key is that the questions were developed to gain important feedback on key aspects of the state plan. As Judy said, if you scan some of the key questions, you can see right away by what was underlined that it corresponds with the use of the Title VII Part B funds, or the use of Title VII Part C funds, or the scope of independent living services.

>> JUDY: Slide 25.

>> BRAD: Which is on slide 25. Everything corresponds, and we asked questions that are going to get important feedback that's going to really direct important content toward our state plan.

>> JUDY: Just an example for you to briefly scan. On slide 26, step 5: depending on what method you use, you may need to develop a script if you're using a focus group or an interview. Select a facilitator. Choose the location, making sure that it's accessible -- which not only means in terms of the disability piece, but also, is there parking? Is it an easy place to find? Is it a comfortable setting? So those are things, depending on the method, that you may need to do in step 5.

Steps 6, 7 and 8 -- you do the method. You do the focus group. You do the community forum. You put out the survey. And then you've got to interpret and report the results. An important piece of any data gathering is to bring the information back to those people who participated. People love to hear how the information that they generated got written up, got summarized, and how it's being used. And lastly, you need to translate those results into action. So whether it's revising the plan, doing follow-up, being clear on what the next steps are -- how are you going to put this information into action?

On slide 28, some of the questions you want to ask yourself in determining which method to use: What kind of information am I trying to obtain? Is it quantitative information? Is it people's feelings, beliefs, perceptions? Who would conduct this kind of research? If you really don't have access to good facilitators, don't do a focus group; don't do a community forum. Who would be participating? For some people, E-mail comment is a much easier format that will increase participation, compared with doing an interview or holding a community forum at a location they may need to get to.

So now I want to just open it up, and, Tim, I think you're helping facilitate this piece on slide 29. What works for you? Questions, comments out there?

>> OPERATOR: If you have a question at this time, please press 01 on your telephone keypad. Again, if you have a question at this time, please press 01 on your telephone keypad. There is one question from Mason. Go ahead, Mason.

>> CALLER: Hi, I'm from the Southwest Center for Independent Living in Springfield, Missouri. I just wanted to inquire how you work around groupthink in those particular situations where one idea might get blown out of proportion and the whole group gets fixated on it, which prevents other ideas from getting into the conversation.

>> JUDY: In a group setting like a focus group or a community forum, that's why I was specific in saying you need good facilitation. That's where facilitator techniques come in. I use, for example, in a focus group something called a nominal group technique, where I ask people to write down their responses and then I go around the room one person at a time.
So I'm really controlling how people are sharing that feedback, and if one person goes off on a tangent, I'm able to say, with my facilitation skills: that's a great comment, Jerry; we want to make sure we hear from other folks in the room. So it really has to do with some good facilitation -- being able to stop a person or redirect the efforts. There isn't a magic answer; often standing physically closer to a person communicates some of that -- it's time to let somebody else come in -- as does encouraging other responses: How many of you agree with Jerry? Anybody else have a different opinion? So you're encouraging the group to think more broadly.

>> BRAD: I know that the facilitation outline that we utilized allowed input on a wide variety of different issues while it also correlated with the state plan. So unless people chose not to give any comment on those issues, that might be one way that people could not participate, but chances are people did respond. It was a way so that people wouldn't get stuck like you're saying, so people could get beyond just the groupthink. The facilitation outline worked well for us in different settings and venues. If we didn't have it, we probably could have been stuck like you have indicated.

>> CALLER: All right, thank you.

>> OPERATOR: Are there any other questions at this time? If you have a question, press 01 on your telephone keypad. The next question comes from India Anderson. Go ahead.

>> CALLER: Can you hear me?

>> JUDY: Yes.

>> BRAD: We can hear you, go ahead.

>> CALLER: (No audio.)

>> JUDY: Nope, we lost her.

>> OPERATOR: The next question comes from Mary.

>> CALLER: Hi, the question is, what about incentives to participate in the survey, or in the forum, or even in the focus group -- I'm thinking food will work really well? Does it color the outcome?

>> JUDY: Brad, you can talk about your experience with this. My experience is that food is always a nice thing to have. It breaks the ice. It sets the tone. For certain populations, it is an incentive to come to the event. And so food is always a good thing. I try to encourage incentives to be minimal. I've found in my work with communities and in nonprofit work that it doesn't make or break somebody being there. They are there because they want to give input and they were asked. So I try to keep the incentive something that's not expensive to the organization -- a coupon to a training, or a mug, or something that they already have. I've used drawings as incentives to fill out surveys: a 25-dollar Target gift card -- if you fill out the survey, your name will go into a drawing for it. That does increase participation. So on the participation side, incentives can help. Does it color the outcome? I haven't found that. Brad, you want to comment?

>> BRAD: Yeah, I mean, we piggyback off of existing events where you already have people who are there. So you already have the audience, which typically helps, and you already have folks who are interested in the issues. The interest factor is already there; it isn't like they are not interested in the topic, and that certainly helps. In terms of some of the surveys, I think it's more or less a matter of working out some of the access issues -- because, again, I think the interest is there -- or the digital divide issues, and making sure that your samples are decent.
So I'm looking at other issues, really, and I don't necessarily think that it's coloring the results, so to speak. I'm just trying to make sure the samples are representative. And I think that people are turning out -- we ended up getting more response to our state plan, and we had to take extra efforts, as you'll find out in the latter part of this, to systematically go through the information -- more than we ever dreamed of. So I think once you put out these methods, you will end up getting the information. People may have more of a vested interest in it than you probably expect. It might be the surprise. And that's not such a bad thing. They are showing that they care; if they have an interest in it, that might not be such a bad thing.

>> JUDY: Tim, do we want to take a few more questions? Or should we move on?

>> TIM: Yeah, I would like to. Let's take two more, and I have one. It looks like only one from the webcast. Lisa, are there any more telephone callers in the queue?

>> OPERATOR: Yes, sir, there are three more -- four more in the queue.

>> TIM: Okay, let's take one of those questions and then I'll read my question. The others we will hold until our final Q. and A. session at the end of the call.

>> OPERATOR: Your next question comes from Teresa.

>> CALLER: My question is regarding the E-mail comments. You mentioned that between two different years you increased your comments from like 2 to 400, and we were wondering, what did you do differently to get a better response?

>> BRAD: All I know is that we set up the system in the first round so that comments could be received, and we put the word out, and for some reason individuals just didn't choose to respond that way. Maybe within our network we just had more advocates and more individuals who, within about a three year time span, were ready and very willing to respond. Now, of course, during that same three year time span we ourselves had expanded and talked to other non-IL networks and had done extensive outreach as well, and had gotten the word out better. But certainly it just made it so that people responded and made their effort. A lot of them at the same time were advocates; there is no doubt about it. People were making their best effort for their programs of choice that they would like to see in the state plan, but at least they were legitimate responses, which is better than having no response at all. So I would guess it would be people just finally stepping up and making the effort. People were advocating. I guess that would be my response.

>> TIM: All right, thanks, Brad. I have one question that I'll ask quickly. We are a few minutes behind, and I apologize we're not able to take all the calls from the telephone queue. We will take your questions -- please remember them or write them down. My question comes from Dean at Oregon Health and Science University in Portland, Oregon. Dean has written in from the webcast and asks: being a rural state in Oregon, we would like to hear strategies to engage rural populations. Brad, I'll start off with you and then we'll see if Judy has anything to add.
>> BRAD: I'll talk more extensively about this later, but one advantage that having that facilitation outline provides is that you can take the facilitation outline, put it into a packet, and send it out to your centers for independent living and/or other providers who you find might be beneficial, and give them the instructions for how to run a local focus group. That way they can have local control and run their own local meeting and get that same input based on the facilitation outline. And that way you erase transportation barriers. In a more rural setting, you can get that input and then send that information back to the council. That was one of the other methods we used. Some people took us up on it, and we thought it was effective.

>> TIM: Okay, Judy, anything to add about engaging rural populations?

>> JUDY: Just thinking through which method really provides the greatest access to those populations. What comes to mind is interviews, surveys, electronic formats, where getting to a central location is not going to be the barrier. So look at some of the other methodologies where you can reach out and the geography won't get in the way. You could probably even do it via some kind of webcast -- a community forum over the web.

>> TIM: Great. I apologize for putting us a couple of minutes behind schedule, but I did want to take as many questions as possible. Again, we'll have another Q. and A. session at the end of the call. Brad, you want to proceed with one SILC's story?

>> BRAD: Sure. Okay, well, for us, it was a transition over several state plans.

>> JUDY: We're on slide 30.

>> BRAD: Yeah, slide 30. From two to three sparsely attended hearings and the state plan partners hashing out the SPIL, to an empowered SPIL committee developing the process and defining modes of feedback and venues, and greatly increasing involvement in the process and input into the plan.

Slide 31. And I believe it all centered on the SPIL committee, the state plan for independent living committee. They played a key role. They were very efficient at completing work tasks, like developing the facilitation outline. They also developed a handout of accomplishments -- like, if we were doing outreach and no one had any idea who the state council was, you would want to educate folks, so we had this handout of things we had accomplished in our different areas over the past dozen years. We felt that was important. We also developed a public hearing schedule, which gave people an idea of different ways they could tap into our process in terms of the modes of giving us feedback. I think the SPIL committee also made important decisions when necessary, such as how we were going to summarize all this increased input and that we needed an extra comment period on a preliminary SPIL draft. And they were very good at stepping back and letting the process work.

Slide 32. How did we achieve expanded input? Well, besides the three statewide public hearings at centers, other methods included breakout sessions at four statewide conferences, especially to gain a non-IL perspective; focus group sessions at the local level; and written comments submitted by individuals online via the SILC website. If you notice, I put down for each one of those the different modes of information that we received, like the facilitation notes, or audiotape, or the summarized notes, or the comments that were received.
I also felt it was important to summarize some of the rationale for why we did some of these: for the breakout sessions, the non-IL input; for the focus group discussions, making sure we got past the transportation barriers, or the fact that some people prefer close group settings, or the advantage of local control and empowerment -- we're trying to get past even the digital divide, if that happens to be the issue. And certainly, for the Internet and E-mail comments, taking advantage of that increase in demand and usage, especially this last time.

Slide 33. People drive priorities. Overall, the facilitation outline helped to structure the feedback, and the SPIL committee tallied the feedback in relation to support for the various initiatives. I felt people and public comments defined the priorities, and with the priorities identified, the SPIL committee matched them up with budget amounts. And then a preliminary SPIL draft was sent out for public comment. It turned out to be, I felt, a very systematic and democratic process. I was impressed. We had all this additional public input, and we found a way to process it, prioritize it, and assign budget amounts to it.

Slide 34. The final push -- only a few comments were received on the preliminary SPIL draft. Very odd. And appropriate changes were made. Was this a fluke? Lack of interest? Or a sign that we had done our job? The final SPIL draft was sent out to SILC members in January 2007 for review, and then in February 2007, at the SILC meeting, the SPIL draft passed with minor edits. No major debates, arguments or filibusters occurred at the meeting. It was amazing.

Slide 35. What worked? Concepts of participation, ownership and legitimacy. And I said this right at the beginning: really, it was an investment of time to gather data and input up front that saved confrontation later. I can remember having state plan meetings where we would argue endlessly on sections of the SPIL, and it was just a pleasure to have a meeting where, besides minor edits, this state plan went through. And that's because the real battles occurred during the committee work. I think it just benefited from having all that input through the process, and it just felt legitimate.

Slide 36. Other NYSILC surveys -- we worked on many other surveys as well. Every two years we do a statewide CIL consumer satisfaction survey. We've done a statewide CIL technology and equipment survey. We did a statewide housing needs survey for people with disabilities; that was kind of limited in sample. There was the statewide needs assessment survey related to our funding priorities. Just recently, we did focus group testing of ballot marking devices -- New York State of course was sued by the federal government, and these were the devices that were in the court order. And then we've done voting trends of New Yorkers with disabilities with the Siena Research Institute and Zogby International. Slide 37. Judy.

>> JUDY: Yeah, so what might get in the way? What might be some of the barriers? I know it's obvious: you don't have the money to do these various methods. We know already from Brad's description, and you can imagine, that it takes up front time, so you need to build that in. Access to people. I want to open it up at this point for some of those additional questions that came in, and other things -- as you think about getting more community input, what would stop you from doing that?
What are some of the things that are still lingering out there that you have questions about?

>> OPERATOR: If there are any further questions at this time, please press 01 on your telephone keypad. First question comes from Sheila. Go ahead, Sheila.

>> CALLER: We're kind of coming from a different angle. We're a community, and what we're asking is, how as advocates in the community do we get our state SILC to open up for public comment?

>> JUDY: I'm going to defer that one to Brad.

>> BRAD: I'm not sure I quite understood the question. Would you mind restating it?

>> CALLER: How do we as community members encourage our state SILC to gather public comment and welcome public comment?

>> BRAD: Okay. They really should. I think it comes back to the question of legitimacy. If you want a legitimate state plan, you should be gathering public comment. If RSA is approving a legitimate state plan, it should have public comment. You know, I told you where we used to start many, many years ago. It used to be about having two to three very boring public hearings across the state that were sparsely attended. It was hard for people to attend, you got very little public comment, and then a few people -- probably a person on the SILC -- would write the state plan. What kind of legitimacy is that?

>> CALLER: Well, that's what we have now.

>> BRAD: That is no way to write a state plan.

>> CALLER: That's why we tuned in to y'all today, to see -- and I think we've got some really good ideas about how we make our SILC accountable to our community and gather information. I just wanted to know if y'all had any other ideas.

>> BRAD: I think what's exciting is to get the information from your community, to hear what people want, to then prioritize it, and then be able to legitimately say: look, this is what people want, and this is what we're going to do.

>> JUDY: Maybe start with your own data gathering and push that information forward to the SILC to say, hey, we held three focus groups and here is what they said.

>> BRAD: How can anyone argue with that?

>> CALLER: That's a great idea. Those are great ideas, and we appreciate it.

>> BRAD: And you know what -- you've got my contact information. If you want any help with how to set up the process, or how we pulled it off, I'd be happy to help you.

>> CALLER: I've got it here and I will be calling you.

>> BRAD: Very good.

>> CALLER: Thank you.

>> OPERATOR: The next question comes from Tony. Go ahead, Tony.

>> CALLER: With contradictory opinions, what tools do you use to prioritize which opinions to incorporate?

>> BRAD: What we did was, we had to find a way, because a lot of it was coming in from a variety of different sources. So how do you weigh input from a focus group, input from a public hearing, input from a breakout session, input from E-mail, input from written comments equally? Okay? Well, you do. You have to weigh it equally, because it is all the same. So we took it all, prioritized it all the same and ranked it all the same, and that's how we ended up literally assigning it. What ends up happening is, each item that got basically the most endorsements got the most weight, which then got it into the plan, and then it was a matter of trying to allot budget to it.

>> CALLER: So am I understanding you to say you're doing it on a quantitative basis?
>> BRAD: That was the only way -- we did it by committee, and by committee, that was the way we felt was the fairest way of doing it.

>> CALLER: Okay, I'm thinking in terms of people in the different subgroups within the disability community differing on how an issue should be addressed. How do you resolve some of those conflicts?

>> BRAD: That's how we did it for this particular state plan process. Now, you're right. If you want to talk about it -- let's take it out of the state plan process. You're saying, like at a disability caucus forum, how would you prioritize issues?

>> CALLER: Correct.

>> BRAD: Okay, that's a good question. That's a very good question, because we've held some disability caucuses, and that's an excellent one: how do you prioritize a disability agenda? It's hard enough to assemble the disability agenda, but that is a tough one.

>> CALLER: But that's essentially what a SPIL is, is putting forth an agenda?

>> BRAD: Well, the SPIL is a little bit different, though. What I'm hearing you say is that in a disability agenda you might have issues like employment and health care and transportation and voting, and you can have a whole series of different things.

>> CALLER: Right.

>> BRAD: How do you prioritize those? And housing -- that would be tough. And maybe you might literally almost have to vote -- I mean, that would be tough to figure out. You might have to figure out a process for it, or once you've identified the issues, have people actually vote on the issues, on the relative rank of the issues. But, yeah, you're right. I've been at some disability forums and I've seen people take different approaches to it. Sometimes what they do is they've kind of cheesed out of that one: they've identified the issues and then sent it to a work committee, away from the forum, to devise the agenda, and people behind the scenes develop it. And that's no way to do it. Any time they pull it out of the forum to a committee working behind the scenes, it gets done by a few. And that's not fair.

>> JUDY: I'll just chime in here. Part of it is using some of these same methods to help you do that prioritizing. So pull a group of people from various entities together in a focus group, or use a survey, to help prioritize the issues. You're using these same methods to get that input about what the priorities should be.

>> BRAD: That was an excellent question. What was your name again?

>> CALLER: Tony, from the Arizona SILC.

>> BRAD: Tony, how are you doing -- Brad Williams.

>> CALLER: Thank you for your help.

>> OPERATOR: The next question comes from Brad. Go ahead, Brad.

>> CALLER: This is Gayle and Dean. There is a group of us here including Brad, but I have the question. In a situation -- let's say we've held a public forum and we've gotten just a ton of public comments, and now we're sitting and looking at a stack of public comments. Do you have any advice on how to sift through that, pull out themes? What do you do with it once you have it?

>> BRAD: Yeah, that's the bridge we had to cross. All I can tell you is what we did. There are different approaches to organizing, and people could certainly choose a different way of doing it. We just found ourselves kind of surrounded with so much information, this was the only way we could do it. We literally found that we had to sift through all of it and assign, like, this matrix: every time youth leadership came up, we would do a tally for youth leadership. Because every time someone would give comments, they might comment on five issues, and you never knew which five issues they might comment on. We would go through and do a raw tally of what it was they were supporting -- it was the only fair way we could find to do it -- and then we'd go on to the next person's comment, and go through until we had, in an exhaustive way, made a tally of what people were supporting and finally had a raw score of what everyone had endorsed, basically put in for. And in an amazing way we had this correlation of what everyone had said they wanted, and it gave us a true picture of what they prioritized, and that gave us the legitimacy that we felt comfortable about. And that's what we did. It was tedious and it took a long time, but that's what we came away with.

>> JUDY: And that's a good comment. The information crunching -- if you are out there to gather community input and you're successful, you get lots of community input. So then you have to crunch that information, and there is some subjectivity in deciding what the categories are that you divvy the responses up into. Brad had a very active and involved committee that engaged in that and took the time to do that; but you have to commit to not only doing the process of gathering the input, but then sifting through it and developing the categories, and you will make some choices about what the categories are.
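Brad's matrix is essentially a theme tally, and it is easy to picture in code. Here is a minimal sketch in Python -- an illustration, not NYSILC's actual tool. The sample comments, the theme list, and the idea of matching themes by keyword are all assumptions made for the example.

    from collections import Counter

    # Every comment is weighed the same, whatever the source
    # (hearing, E-mail, focus group). These are made-up examples.
    comments = [
        "We need more youth leadership and better transportation.",
        "Please fund youth leadership programs.",
        "Housing and transportation are my top concerns.",
    ]

    # Hypothetical themes and the keywords that signal them.
    themes = {
        "youth leadership": ["youth leadership"],
        "transportation": ["transportation", "transit"],
        "housing": ["housing"],
    }

    tally = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in themes.items():
            # One tally per comment per theme, even if mentioned twice.
            if any(k in text for k in keywords):
                tally[theme] += 1

    # Rank themes by endorsements, the way the SPIL committee did.
    for theme, count in tally.most_common():
        print(f"{theme}: {count}")

The keyword matching here is only a stand-in for the human judgment Brad and Judy describe; choosing the categories in the first place is still the subjective part of the job.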
>> OPERATOR: The next question comes from Richard. Go ahead, Richard.

>> CALLER: Hi, I had a question regarding ensuring that throughout the information gathering process you include minority groups, specifically groups that might be E. S. L. I'm thinking, Brad, you might have some good response, being that you're out of New York.

>> BRAD: Right. We have that representation on the council, and also, I know there are certain centers that did focus groups that had that as well. For instance, I know one of the New York City centers and, I think, the Utica center did focus groups. Utica is a clustered center that includes others, and Amsterdam is heavily Spanish speaking, and I know it's the same down in New York City -- they have multiple languages, including not only Spanish but Mandarin and French Creole. There are so many I can't even keep up with how many. But we get that representation on the council, too. We have a 121 project representative on the council as well, so we even have someone from the Native American community, from one of the reservations out in western New York. We try to achieve diversity many different ways in the public input: geographic, ethnic, cultural, language, cross-disability of course, and non-IL.

>> CALLER: Just a follow-up question. You had spoken about sending out packets with guidance for folks doing or facilitating those community forums. Did you do things in different languages, where you sent it out to some of these groups so they could do it at that local level with their constituent group and get the information back to you guys?
>> OPERATOR: The next question comes from Richard. Go ahead, Richard.

>> CALLER: Hi, I had a question regarding ensuring that throughout the information gathering process you include minority groups, specifically groups that might be ESL. I'm thinking, Brad, you might have a good response, being that you're out of New York.

>> BRAD: Right. We have representation on the council, and I know there are certain centers that did focus groups that had that covered as well. For instance, I know one of the New York City centers and, I think, the Utica center did focus groups. Utica is a clustered center that includes other communities, and Amsterdam is heavily Spanish speaking. The same goes down in New York City -- they have multiple languages, including not only Spanish but Mandarin and French Creole. There are so many I can't even keep up with how many. But we get representation on the council, too. We have a 121 project representative on the council as well, so we even have someone from one of the Native American reservations out in western New York. We try to achieve diversity in the public input in many different ways: geographic, ethnic, cultural, language, cross-disability of course, and non-IL.

>> CALLER: Just a follow-up question. You had spoken about sending out packets with guidance for folks doing or facilitating those community forums. Did you do things in different languages, where you sent it out to some of these groups so they could do it at that local level with their constituent group and get the information back to you?

>> BRAD: You know, I'm trying to think what we did there. We may have worked with them and followed their lead in terms of what they needed. I don't think so, because they certainly didn't say anything to us, and they probably would have taken care of it themselves. I know that when we do our CIL consumer satisfaction survey, we take care of the multiple languages on that, because it's a statewide survey that I generate from here. For this particular effort, those locations would have determined the need and taken care of it when they facilitated it at the local level, because I know I didn't generate it. Both communities have that particular need, but part of it is that I put it in their hands and they facilitated it at the local level. I can find that out for you.

>> CALLER: Okay, just one last piece on that question, going in the same area around underserved disability populations. In planning your strategies for collecting data, did you develop any strategies to try to tap populations that fell into those categories, to ensure their involvement?

>> BRAD: Yeah, we went to some of the different non-IL communities. There was one particular group we wanted to get to: with the Iraq and Afghanistan wars and the veterans who were coming back, we had it in our state plan to try to connect IL services to them. In our state, the New York State Brain Injury Association happens to have obtained a large contract to work with the veterans, so we made sure we were able to work with them to try to connect out to those veterans. They were definitely an underserved population, and the Brain Injury Association was one way we could network with a particular non-IL group to get to them. There are other non-IL groups we reached out to as well, so we can get to some of these other traditionally non-IL populations.

>> CALLER: Got you. Thank you. That's an awesome idea to collaborate with the B.I. group. That could work in our area as well.

>> TIM: This is Tim. I'm going to jump in real fast, Brad. Richard, thank you -- those were great questions. I know most SILCs are doing this, but if you're not already asking whether your council is representative, you should be. So thanks. Also, Judy, on another good question that came from Tony in Arizona, would you mind going back for a second? You began to speak about taking the results of your surveys, sending them back out to the community, and asking people to prioritize. Would you mind expanding on that?

>> JUDY: Well, part of thinking through gathering community input with any of these methods is that you can combine them. So you could start out with a written or electronic survey to figure out what all the issues are out there, and then do focus groups to help prioritize those issues.
I've used it in establishing a training curriculum for nonprofit management: what are all the things people want to learn about in nonprofit management? Gather all of that via survey, and then hold focus groups to say, okay, let's talk about all of these -- the pros and cons, what are you looking for? Or do interviews with particular key populations if you feel they are underrepresented or you want to hear their particular viewpoint. So it's just being creative about the ways to combine these methods to gather the input and involve the community -- maybe in more ways than you had initially thought of -- not only in getting the information, but in helping sift through it.
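[One way to run the prioritization step Judy and Brad describe -- participants ranking the survey-surfaced issues -- is a simple point count over the rankings. This is a hedged sketch, not a method prescribed on the call; the issues and rankings below are hypothetical.]

```python
# Aggregate focus-group rankings into a single priority order:
# each participant ranks the issues best-first, and an issue earns
# more points the higher it is ranked (a Borda-style count).
from collections import defaultdict

issues = ["employment", "housing", "transportation", "voting"]

# Each participant's ranking, best first (hypothetical data).
rankings = [
    ["housing", "employment", "transportation", "voting"],
    ["transportation", "housing", "voting", "employment"],
    ["housing", "transportation", "employment", "voting"],
]

scores = defaultdict(int)
for ranking in rankings:
    for position, issue in enumerate(ranking):
        # Top rank earns the most points; last rank earns one.
        scores[issue] += len(issues) - position

# Highest score first: the group's combined priority order.
for issue in sorted(scores, key=scores.get, reverse=True):
    print(issue, scores[issue])
```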
>> TIM: Thanks. I want to say before we take another question that we are low on time. We have just two minutes left, and because we are using a CART writer to caption this presentation and create the webcast, we cannot go over. So I want to point out that I was going to review the resources that can help, which are on slide 38; I'll simply refer you all to that slide. It's a list of three books and several websites, including one book on focus groups by Judy herself and an outline of Brad's process that's on the New York SILC website. I'm sorry we don't have more time, but I really do want to get to as many questions as possible. Lisa, is there anyone in the queue?

>> OPERATOR: The next question comes from Darnise.

>> CALLER: Yes, we would like to know what recommendations you could provide us about diversity within data collection and driving priorities.

>> JUDY: Diversity in terms of who participates? How you ask --

>> CALLER: Diversity as it's reflected within the data -- how to be more inclusive within the data collection, and then how you set your priorities based on a diverse voice.

>> JUDY: Well, I can maybe speak to the first part and look to Brad for the second part. Part of it is the up-front work you do to figure out who you need to hear from and how you're going to get access to those populations. So, as Brad already talked about: partnering -- thinking about which partners have access to those populations. Are they underrepresented? Do you need to make special efforts to hold focus groups with particular populations? That's part of the up-front work you need to do before you go out and gather the data: figure out who you need to hear from and how you're going to get to those folks. In terms of the diversity of the input you receive, how you sift through it, and what priority you give it -- do you have any thoughts on that, Brad?

>> BRAD: I'm going to make a quick plug here. We're about to engage again in the fall with a pollster, probably Siena Research Institute or Zogby International. This would be a good job for a SILC, because it's a nonpartisan election poll to look at voting trends of people with disabilities. You piggyback on one of their nonpartisan polls. They are able to ask a question like: do you have a medical condition or disability that impacts your mobility, hearing, sight, cognitive or mental abilities? They insert that into their voting trend poll, then ask all their other questions, and you get a subsample of voters with disabilities. They can then ask all the other subquestions -- gender, age, race, ethnicity, income, everything else the pollsters ask -- and those can be put into subsets for you. All you do is pay for the question, you get everything else analyzed, and you'll have the voting trends of people with disabilities. It's well worth it. You are literally just paying for one question on an existing poll they are already going to do. For our last one, we found that 9 percent of the voters polled in the sample were voters with disabilities, and that's a pretty significant thing. Then you get a subset of that demographic group, and analyzing the trends of those voters with disabilities would be a good start.

>> CALLER: Outstanding. Thank you.

>> TIM: Brad, I'm sorry, I hate to do this, but we are out of time for today. I cannot believe how quickly that 90 minutes flew by. We probably do have some remaining questions in the queue, so I want to make a commitment to those people still waiting: please forward those questions to me at tim@ncil.org, and I will make sure to get a response to you. Also, slide 1, the introduction slide for the PowerPoint presentation, has contact information for Judy and for Brad. They have generously supplied that so you all can follow up with them. And as another resource for follow-up, please feel free to e-mail me your questions; we'll make sure to respond. A couple of things extremely quickly: I want to thank everyone for joining today's call. We really appreciate you being a part of this, and I especially want to thank Brad and Judy for their hard work in designing today's presentation, putting together the PowerPoint, and responding so thoughtfully to all of your questions. Please visit our training page. I don't have time to give the URL again, but I gave it at the beginning of the call, and it's the same one you used to connect to the webcast and to get the telephone number for the teleconference. At the top of that page is our evaluation form. It is an online survey -- how about that -- but we have found that the website we use to host the survey is completely accessible. Richard Petty and Darrell Jones at ILRU have offered to be a point of contact, and I want to thank them for that. If any of you would like more information about the host, especially in regard to its accessibility, or if you have any comments or concerns, you can contact them through ILRU's website, or contact me and I'll put you in touch. So thanks again, everyone. Any lingering questions, don't hesitate to pass them along to me, whether it be in five minutes or in five weeks. Thanks again. Staff and presenters, if you could hold the line; everyone else, please stay in touch. Thank you.