MICHAEL HENDRICKS: Yes, here is where we left off. Hopefully, however many objectives you had, let's say you had 8, you had evaluated their implementation on each of the five areas, and then you had done an overall implementation. Well, now: so what? So this one comes out low. Well, this next part isn't exactly rocket science, but you have to remember we produced this information in order to use it; right? That's the putting the mirror up there; we are trying to use it. So here is what we simply say. If the overall implementation for an objective, any of the objectives at all, and that's the overall score, that's the one at the bottom, right, if the overall implementation is high or medium, well, first of all, congratulate yourselves. You have done a good job; nothing wrong with feeling good. See which of the five aspects scored the lowest. Let's go back to this one. This one didn't quite score high enough. Which one scored the lowest? I guess activities scored the lowest, didn't it? Activities here scored the lowest because there's none yet. You can see which one scored the lowest. Look for ways to improve those aspects, no doubt about that; we would want to look for ways to start making those activities happen, obviously. And we also might want to look at participants too, right? That scored pretty low. We might want to find ways to work on that. And then also keep watching in general for any problems, ways to improve, better ideas. Because, for example, that staff high: what if you have some turnover, you know, two or three people turning over, and boy, that staff high could go to a low just like that. So you can't stop looking. But that's the nice situation: the overall implementation is high or medium. But what if it is not? If it is low or none yet, first of all, rethink. Now, if the overall is low or none yet, it is a serious problem, and you just can't ignore it. This is a serious problem.
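[Editor's note: as a hypothetical sketch, not part of the workshop materials, the rate-and-respond rule just described, where each objective gets a rating on the five aspects plus an overall rating and the overall rating decides the response, might look like this in Python. The ordinal scale and all names are assumptions made for illustration only.]

```python
# Hypothetical sketch of the rating-and-response rule described above.
# The ordinal scale and every name here are invented for illustration.
SCALE = ["none yet", "low", "medium", "high"]

def review_objective(name, aspect_ratings, overall):
    """Suggest next steps for one objective.

    aspect_ratings maps the five aspects (resources, staffing,
    participants, activities, management) to a rating on SCALE;
    overall is the overall implementation rating.
    """
    steps = []
    if overall in ("high", "medium"):
        # Good news: find the lowest-scoring aspect(s), improve them,
        # and keep watching in general for problems.
        lowest = min(aspect_ratings.values(), key=SCALE.index)
        weak = [a for a, r in aspect_ratings.items() if r == lowest]
        steps.append(f"{name}: look for ways to improve " + ", ".join(weak))
        steps.append(f"{name}: keep watching for problems and better ideas")
    else:
        # Serious problem: ask why, then act immediately and decisively.
        steps.append(f"{name}: ask why implementation isn't better")
        steps.append(f"{name}: decide what to do to improve it, and act now")
    return steps

ratings = {"resources": "high", "staffing": "high", "participants": "low",
           "activities": "none yet", "management": "medium"}
for step in review_objective("Objective 1", ratings, "low"):
    print(step)
```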
And we think it would be good for you to ask two important questions: first of all, why isn't the overall implementation better than this, and in this case it might be pretty obvious that we don't have any activities happening yet, okay; and then, what can we do, obviously, to improve it? So you focus in on the areas by acting immediately and decisively. So I would say again the participants and the activities would be here. Nothing fancy about that, but we just don't want you to stop at this page. If you stop at this page, you have collected the data but you haven't done anything with it. That's not the spirit of evaluation we want. The spirit we want is: do something with it, make your program better. So this is if it is high or medium. This is if it is low or none yet. A couple of details: you still aren't quite ready, in my opinion, to go out there and evaluate implementation yet. So here are some details you will have to figure out, because there's no way I can even begin to suggest how you should answer these questions. But I can definitely say you've got to ask these questions of yourself. First one: who is going to evaluate implementation? And that has come up. You have already said that. You know, how can I in the SILC evaluate implementation? Our answer has been: hopefully you don't; hopefully you do it collaboratively with other people. The lead organization for the objective, now what does that mean? You remember last year in Las Vegas you saw this diagram. We drew this. It is in your handout in case you can't read it up here. What we said is that, quite appropriately for the IL philosophy, consumers are at the center of everything. They're there, and we are proud that they're in that middle circle. The immediate ring right around them are what we might call the key IL partners: you, the DSUs, and the CILs. That's what we would say would be the key IL partners, but there's also another ring around that. In many states this other ring is pretty darn important.
So you have got other state agencies, councils, service clubs, other public and private entities, local agencies, schools, and you could probably add a whole bunch more. So there are a lot of people who might be involved in achieving an objective. And you remember what we said last year. I am sorry, Darrell. We are one minute away from a moment of silence. So let me finish this slide and then we will be silent for a bit, if that's okay. Just to finish this slide: we said last year that somebody ought to be responsible to be the lead organization, to take charge of making an objective happen. It is important for the buck to stop some one place, and it might usually be the SILC or the DSU or the CILs, but maybe not always. So the key thing was to figure out who is the lead, and what we are saying here is possibly the lead organization should be the lead evaluator, or possibly not. It's a good question. We are going to get you started on this task, though, because we have had good progress going on this issue of the leg of implementation. But there is this detail of who is going to do it. Now, what I was just saying is that maybe, maybe the most appropriate group to evaluate implementation is the group that is responsible for making that objective happen. Or maybe that's the least appropriate group. Maybe those are the last people who ought to be responsible for evaluating implementation. I don't know. Maybe a different IL partner, okay. So for instance, maybe if you -- I guess you are not, but let's say maybe somebody else is responsible for implementing it; maybe they're the wrong person to evaluate it. Or maybe a different organization altogether, or maybe a contractor. Maybe you want to contract this out, I don't know. Or, my preference, maybe a group process, okay? Maybe you do it collaboratively. Second detail you have to figure out: how are you going to do it? How are you going to evaluate implementation?
Are you going to call somebody up and ask them what they think, you and your colleagues? Are you going to do site visits to different places in the state and hand out checklists? Are you going to do observations? Are you going to look at files? I don't know. I don't know how you are going to do the evaluation of each implementation. I am guessing it might be different for each objective, you know, because each objective is different; right? So you probably have to do it a little bit differently. The method you use might be a little bit different. And then the last one, the last detail, is how often should you evaluate implementation? How many people think you should wait three years before evaluating implementation? Yeah, neither do I. How many people think you should evaluate implementation every week? I don't either. So what is proper? I heard somebody yesterday, it was really good to hear some reports yesterday, that every quarter at their SILC meeting. Who said that? Someone said at their SILC meeting every quarter they talked about how things were going. Who said that? Do you remember? Was it you? You don't think so. I think that's ideal perhaps; annually may be too often, I don't know. These are the three details: who, how, and how often. Okay. Your turn. Here is your assignment. On your table, you see this? Okay. There's one on every table. This will look awfully familiar. It is just what I put up on the screen. It has got the SPIL objective, who will evaluate implementation, how will we evaluate implementation, and how often will we evaluate implementation, and then it's got the five key aspects and how well implemented. So here is your task. As a group, at your table, have one of you talk about an objective from your SPIL. That's the first thing. You have to all be on the same page about the objective. Have that person describe the situation in a little more detail so you can know what's going on, okay?
You've got to have a sense of what's happening here with this particular objective. Discuss the who, how, and how often and fill that in; you know, there's a place right here, who, how, how often, fill it in. Evaluate the implementation, that will be interesting to see how you all, or if you all, feel the same way around the table. Evaluate the implementation on each of the five aspects, and then, just for this exercise, do the overall down at the bottom, and then discuss how you came to these ratings. I want to hear when we come back what you think of the process of doing this. Any questions?
AUDIENCE MEMBER: We might do the front table, Michael, if that's all right.
MICHAEL HENDRICKS: There you go. The front table is volunteering.
AUDIENCE MEMBER: We are the show-me state. Everybody in this is from Missouri. And our objective is from our state plan that our SILC chair brought. Individuals with disabilities in Missouri have access to information and referral services.
MICHAEL HENDRICKS: So this is very different. This isn't targeted on the partners at all. This is targeted on the persons with disabilities, and it is their knowledge of, their awareness of having oh. . .
AUDIENCE MEMBER: Of obtaining information and referral services.
MICHAEL HENDRICKS: Okay. Great. Tell the group what happened.
AUDIENCE MEMBER: Okay. Who will evaluate the implementation? That goes back to the three partners: the CILs, the SILC, and the DSU. We don't have a contract for gathering information; that's the three entities' responsibility. How will we evaluate the implementation? In Missouri we have a SPIL quarterly evaluation tool that we turn in with our QSR. So that will be done on a quarterly basis. How often will we evaluate the implementation? That's quarterly. And then our five aspects we may --
MICHAEL HENDRICKS: Can I stop you for one second?
AUDIENCE MEMBER: Sure.
MICHAEL HENDRICKS: That part about how you are going to do it, that sounded a little vague to me.
AUDIENCE MEMBER: The SPIL.
MICHAEL HENDRICKS: I have to push you back a little bit on that.
AUDIENCE MEMBER: That's fine. We have a SPIL quarterly evaluation tool that was developed with the SILC and DSU, that all centers, all of the CILs, report on a quarterly basis, and we evaluate all of our goals and objectives.
MICHAEL HENDRICKS: Your objectives.
AUDIENCE MEMBER: Yes, our objectives, on a quarterly basis.
MICHAEL HENDRICKS: But the information here is information that you've got to get somehow from persons with disabilities, right, because you have to find out what they're aware of and not aware of; right? How are they going to get that information?
AUDIENCE MEMBER: Actually, we are looking to increase the number of people using the I&Rs.
MICHAEL HENDRICKS: Ah, using the I&Rs. Slightly different objective, same ballpark.
AUDIENCE MEMBER: So we are looking at, I guess, numbers to see if those numbers are increasing.
MICHAEL HENDRICKS: Okay, so, well, we have got to draw a distinction here between implementation and the objective itself. We will talk this afternoon about the objective itself and whether you are achieving the objective. We are just now talking about implementation of it. How are you going to find out if it is being implemented? Are you going to interview people, are you gonna call people, are you gonna look in some files? What are you going to do to find out if it is being implemented?
AUDIENCE MEMBER: We are drawing numbers from the CILs that they're getting and reporting, and so that is where we are making that evaluation: if the numbers are increasing, then we are achieving that objective.
MICHAEL HENDRICKS: We will talk more about this this afternoon. I did not mean to stop you. So then you have got some ratings there.
AUDIENCE MEMBER: Resources high; staffing high; participants high, much like the table over there. Our activities, because of everything that was involved in trying to do that, we identified four activities, and two of them are currently active, so we said medium. Management high, and overall implementation high.
MICHAEL HENDRICKS: Okay. Okay. So this is an objective that you feel is being implemented quite well.
AUDIENCE MEMBER: Yes, at this time.
AUDIENCE MEMBER: We made that discussion, that point, as you did, that just because it is high now doesn't mean next quarter or a year from now that it is still going to be, so that ongoing process, in our case quarterly, has to be constantly evaluated and looked at.
MICHAEL HENDRICKS: Okay. Good. Thank you very much. I appreciate that, table. You have shown us. We have a question here. You want to go next? Let's see, any questions or comments about that one? You want to go next? Go ahead.
AUDIENCE MEMBER: Okay. This is Regina from Texas. The SPIL objective is an increase of transportation services to consumers by the network of CILs, which includes 1,617 services in 2011, 1,649 services in 2012, and 1,682 services in 2013.
MICHAEL HENDRICKS: So basically persons with disabilities have more access to transportation. Okay.
AUDIENCE MEMBER: And who will evaluate implementation?
MICHAEL HENDRICKS: Implementation of it, yes, not whether in fact they are having more, but the implementation of your efforts to try to make it happen. Who's going to look at that?
AUDIENCE MEMBER: Okay. I put the state association.
MICHAEL HENDRICKS: Oh, interesting. She puts the state association; they're not in the circle at all, are they? Is that correct?
AUDIENCE MEMBER: Correct.
MICHAEL HENDRICKS: So now why, you pick -- you gave the assignment to someone out here.
AUDIENCE MEMBER: It is still the CILs.
MICHAEL HENDRICKS: State Association of CILs.
AUDIENCE MEMBER: Uh-huh.
MICHAEL HENDRICKS: I am sorry, I misunderstood.
AUDIENCE MEMBER: I put the state association and then the individual CILs, since not everyone is a part of the association, and then also the SILC and the DSU.
MICHAEL HENDRICKS: Okay. So a collaborative type of effort, it sounds like.
AUDIENCE MEMBER: Correct.
MICHAEL HENDRICKS: I like that.
AUDIENCE MEMBER: And then how will we evaluate? We have a monthly reporting tool called the 3160 that the CILs must complete. We use that information to transfer all of our numbers, to make sure that we are staying on target. So that's a monthly report, and there are also surveys that go on, and we have a quarterly SPIL monitoring tool that we use as well. And so we will report monthly and quarterly; if we miss something monthly, we get it at the quarterly evaluation.
MICHAEL HENDRICKS: Okay. I think I need to ask you to stop for one second, because I think I have failed miserably at something.
AUDIENCE MEMBER: Oh no.
MICHAEL HENDRICKS: I think I have failed miserably, and I need to stop us and see if I can correct it. Let me go back once more. Okay. We really, really, really need to have in our heads the difference between implementing all of those things that have to happen in order that we even have a chance that an objective is going to get achieved. That's one thing. That's the implementation. And the other side is progress, or how we are actually doing on that objective, okay? So there's a difference between implementing and it actually getting better. Let's take an example. Let's try to make up an example. Say it is this training program. What is the implementation of this training program? Well, we have got to have a room. We've got to have people. We've got to have a facilitator, and we have got to have some materials, and we have got to have time. Think of all the stuff you've got to have happen in order to make the training -- one second if you would -- to make the training session happen. Okay. That's implementation. That doesn't necessarily mean you've learned anything.
Maybe I have done an awful job. Maybe the materials, you know, are crummy. Maybe you have no interest in this topic; completely different, right? So we can implement something really, really well, or as best we think we can, but it doesn't necessarily mean our objective of you being better at evaluation is going to be achieved. They're two very different things. So all we are talking about this morning so far is that first part. It is the implementation. How did we put it here? We had a phrase here: making happen all the things that need to happen in order to achieve all the different objectives. So making happen all that stuff that needs to happen, okay. How was the process of doing this kind of an exercise? Did it seem to work for you, or to be something you can do when you go back home? Or not. Feel free to say not.
AUDIENCE MEMBER: I did, I just wanted to make sure I was doing the bottom part right, but I think I get it. Okay.
MICHAEL HENDRICKS: I think you do too.
AUDIENCE MEMBER: Okay.
MICHAEL HENDRICKS: Yeah.
AUDIENCE MEMBER: I guess I was sitting here with an aha, because maybe it has to do with impatience or something, but I always -- well, I guess for the first time I have understood the distinction between implementation and the outcomes. You know, I always saw the SPIL as reporting on how we are doing on outcomes.
MICHAEL HENDRICKS: Okay. And part of it is. One of the three legs of the stool is definitely on how you are doing on a certain level of outcomes, which is that lowest level that we are calling objectives.
AUDIENCE MEMBER: Right, but what Regina was just doing is exactly what I thought of as what an evaluation of the SPIL should look like. But I am seeing it differently. It is more, it is more of a report on -- I don't know how to say it. It almost seems in some ways like it is another step in the process of saying how we are doing on the outcomes.
MICHAEL HENDRICKS: Remember, this stool we are talking about has three legs.
So far we have only talked about one of them. So don't jump to any conclusions yet about what we mean by evaluation, because we haven't talked about the other two legs yet.
AUDIENCE MEMBER: Okay.
MICHAEL HENDRICKS: This evaluation we are suggesting you do has three different aspects, three legs of the stool, and two of them we haven't talked about yet. So far we are only talking about one leg of the stool.
AUDIENCE MEMBER: So that means by the end of tomorrow I will be really confused?
[ laughter ]
MICHAEL HENDRICKS: You will be brilliant.
AUDIENCE MEMBER: Oh good. Well, that I knew.
[ laughter ]
MICHAEL HENDRICKS: But I think it is a true indicator of success that by lunchtime at least one person has had an aha moment. You are getting there. All right. And I don't want to necessarily go to other tables, but I would like to hear a more general discussion, if we could, about what you think of this whole thing of this one leg. Because, as I say, you know, there's the leg of implementation, there's the leg of progress on objectives that you are talking about there, and there's the leg of consumer satisfaction. Remember those three legs. Forget the latter two; we haven't talked about them at all yet. We have only talked about the first leg of implementation, but this is all we are going to say about the first leg of implementation. So this is your time to talk about our suggestions for evaluating implementation. We have one back here.
AUDIENCE MEMBER: This is Florida and Tennessee, and actually we kind of had a question. When Florida was working this through, it started to make more sense, and we kind of figured out how to try to incorporate what the centers for independent living are already doing in their reporting mechanism, because a lot of them are doing what we have as far as our strategies and activities and objectives, and how to incorporate it so it is not putting more work on top of what they are already doing.
But I had a question. Tennessee, for right now, only receives Part C money. So they have very little input into, or the DSU doesn't do anything with them. So the centers already report directly to RSA, because there's no Part B money, no I&E money, no Social Security, general revenue, or any other sources of funding. So when you are looking at this process, each state is done so differently that it is very difficult to try to plug one way into this. So how would a state, for instance, like Tennessee, which has no ability to receive anything from their centers or their DSU, even begin to evaluate the SPIL? I am speaking for you guys. I'm sorry.
MICHAEL HENDRICKS: A great question, and relevant to all of this. I just want to point out that a guiding principle we mentioned early on was that evaluations will vary from state to state. By no means do I expect every state to do things the exact same way. You are saying here is a case where things are going to have to be done quite differently just because of the local situation. You know, I am not expert enough in IL to know exactly the implications of all you just said, but it certainly sounds like they're going to have to do things differently, for sure. That's fine. That's fine. We are kind of like the generic model. Remember the generic logic model yesterday for the SILC; there are going to be some variations from this. Do the variations you need to do for your own state.
AUDIENCE MEMBER: To me, this first leg of this tool is incredible, because I think that we all have a tendency to write objectives in our plan that are long-term and overwhelming and sometimes seem impossible to achieve. And then when you try to evaluate the implementation of your plan, you think we haven't accomplished anything, and look, it is terrible. But if you also look at what you have done to try to get there, that can be kind of reenergizing, to keep building on accomplishing objectives. I have learned two things today.
One is that I think we are all still writing objectives that are at least lofty, if not impossible to achieve. So I think we need to bite off smaller pieces, so to speak.
MICHAEL HENDRICKS: I would echo that. Having looked at SPILs, I was looking for lower-level outcomes that flow straight out of activities, so I could call them objectives, and I wasn't finding a lot.
AUDIENCE MEMBER: Yeah. But that's what I am saying. But I think it also is really helpful to look at how to evaluate what you're doing on the ground to try to get there. So you have some accomplishment regardless of whether you're making as much progress as you would like on actually achieving objectives. So thank you for that.
MICHAEL HENDRICKS: That's what implementation is about. Yeah.
AUDIENCE MEMBER: Ann's comments were very helpful. Part of what I think is difficult, for those of us in states where there's not a lot of skin in the game on the part of the DSU, is that if you look at the way the regulations are written, there's an assumption that the DSU is a significant player and investor in independent living services, and in fact all over the South, with the exception of Florida, it is just simply not the case. And so that is, I think, what makes it hard for us to try to get the other players, because quite honestly, in our state, about 15 years ago the DSU just, quote, gave independent living to somebody else.
MICHAEL HENDRICKS: So I think you are saying in this circle it's the SILC and the CILs and nobody else.
AUDIENCE MEMBER: Pretty much. Now, what it is that we are trying to do is to get some reinvestment on the part of the DSU, and that's coming along a little way, but it's this notion that if you don't have skin in the game it's very hard to get players to the table. So you have to resort to charm, which can be a problem.
[ laughter ]
MICHAEL HENDRICKS: Not in your case.
AUDIENCE MEMBER: For some of us it is a problem.
So, you know, it is just a reality, and I think that those of us in the southeast need to probably work a little bit together to figure out some of those kinds of things. The other thing about what Ann said that is true is that when we look at the realities of people with disabilities in our states, especially in states like Georgia that don't invest a lot in any kind of disability services, I mean, the fact is we are not the Olmstead state by accident. It is a fine and long-standing tradition of locking folk up and ignoring the federal government and ignoring Supreme Court decisions and blah, blah, blah, blah. So what's worrisome about trying to bite off manageable pieces of this is that the real issues that confront people with disabilities every day, nobody is handling, because it is too hard to do. And so that's the nerve-racking part: how do you sustain the real vision about what needs to be going on and also deal with the manageable pieces, so that you can tell if you are getting there or not?
MICHAEL HENDRICKS: Okay, good point. One here.
AUDIENCE MEMBER: I am Julie from Arkansas, part of the southeast. And actually we have a good relationship with our DSUs, and I would kind of like to know if what this lady is saying is true, by a show of hands. I mean, do most of you have problems with your DSUs? Do you really? Well, then I live in a lucky state, I guess. Oh, well, it is really not even about money for us. We feel like they're --
MICHAEL HENDRICKS: Those weren't half -- those were quite a few hands, but I don't think they were more than half of the hands. Very much split.
AUDIENCE MEMBER: Good. Hopefully more people will end up on our side of getting our money and things being taken care of. My question is, this is a great model and I am excited about it, but I am a little concerned about the time that will need to be invested in it, and do you have any suggestions as to how to manage your time so that people continue to come to the table?
We meet on a monthly basis. Our SPIL committee does, which consists of our DSUs, our CILs, and our SILC, and then we report on a quarterly basis. But too many times that turns into a three- or four-hour meeting, and the next time around people are saying, I can't commit to that. So do you have any suggestions for how to do this much work in a less time-consuming way?
MICHAEL HENDRICKS: I have a suggestion. It may or may not be a good one, and you may come up with a better one. My suggestion, always, when people talk about the difficulty of getting people to focus, is to structure the task. So I think you use a table just exactly like this, and instead of the numbers 1, 2, 3, 4 across the top you write in there the objective, the language of the objective. You can even send this to people in advance of the meeting; you can even do the whole thing by e-mail; you can do whatever you want. But structuring the task and staying on task, I just find, saves so much time. So yes, that is a suggestion. There may be other, better suggestions in the room. But that's one. Over here, we have a comment. It is almost 12:00, and I try to honor your time. So let's take maybe one more.
AUDIENCE MEMBER: I was going to say another way to break it down is to break the objectives down into committees, rather than having a committee whose responsibility is the whole SPIL. What we have done is we have formed a committee around each objective, and then they come up with their list of activities, and we evaluate based on what they do, and that way you don't have a committee trying to look at the whole SPIL, because it is so much. That's just one suggestion.
MICHAEL HENDRICKS: I like that because it fits into the whole objective-by-objective approach that I think you must take.