I've been helping a new-to-us group, the Education Program for Gifted Youth (EPGY), assess integration with the Stanford Sakai deployment. They would like to use
Coursework to support their distance learning offering, the Online High School. They are the only one of our Stanford clients for whom Coursework isn't a supplement to face-to-face (F2F) classroom instruction.
Our first meeting (I attended via phone) was fun and interesting, which was a bit of a surprise. They brought a group of developers perhaps larger than our own, and they were ready to roll! We outlined a few ways to proceed:
- They provide Stanford Registry CourseClass-compliant XML documents, which we load into the CM system; after that they use the self-serve model we use with our current client base.
- They use WebServices to build/clone sites and manage rosters directly.
- They use Coursework/Sakai in a fully manual mode.
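To make option A concrete, here's a toy sketch of generating a per-class document. I'm not reproducing the actual CourseClass schema here; every element and attribute name below is a stand-in. The point is just the shape of the workflow: build one XML document per class, hand it off for loading into CM.

```python
# Illustrative only: these element/attribute names are NOT the real
# Stanford Registry CourseClass schema -- just a stand-in for the workflow.
import xml.etree.ElementTree as ET

def build_course_class(course_id: str, title: str, students: list) -> str:
    """Build a toy course/roster XML document and return it as a string."""
    root = ET.Element("courseClass", {"id": course_id})
    ET.SubElement(root, "title").text = title
    roster = ET.SubElement(root, "roster")
    for student_id in students:
        ET.SubElement(roster, "member", {"id": student_id, "role": "student"})
    return ET.tostring(root, encoding="unicode")

doc = build_course_class("EPGY-OHS-101", "OHS Algebra", ["jdoe", "asmith"])
print(doc)
```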
For now I'm turning them loose and encouraging option B. Their development crew seems eager and capable, so in this case I decided they could learn by doing. The WSDLs will give them a rough framework, and I've built some sample Sakai sites they can compare against. They'll hunt me down via a mailing list I set up for our collaboration, and when we reach a deeper stage in the partnership we'll open up our Bugzilla instance to their QA/developer team.
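For the option B crowd, the mechanics are plain SOAP: build an envelope, POST it with the right headers. The sketch below hand-rolls a SOAP 1.1 envelope; the operation name (`createSite`) and service namespace are placeholders of mine, not real Sakai operations — the WSDLs we handed the EPGY team define the real ones.

```python
# Hedged sketch: hand-built SOAP 1.1 envelope. The operation name and the
# service namespace below are placeholders, not real Sakai WebServices
# operations -- consult the WSDLs for the actual contract.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.org/sakai-ws"  # placeholder namespace

def soap_request(operation: str, params: dict) -> bytes:
    """Serialize a one-operation SOAP 1.1 request body."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, name).text = value
    return ET.tostring(envelope)

payload = soap_request("createSite", {"siteId": "epgy-demo", "template": "ohs-course"})
print(payload.decode())
```

To actually send it, a client would POST the payload with a `Content-Type: text/xml` header and the `SOAPAction` named in the WSDL.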
Why this is going to work
In this case EPGY has a working, accepted example of Sakai in front of them; they don't need to do a full assessment. Because of this they aren't asking for Sakai design documents or database data dictionaries. They trust that we've done that analysis already. This trust frees them to act as adult learners: learn by example and experiment.
Contrast the EPGY team with organizations that don't have trusted peers with Sakai experience. Those folks looking to adopt Sakai who do want a fuller assessment have to commit to plowing through our collective ant-hill of self-generated content and Javadoc. By the time they've waded through enough to get their confidence level up, the cost of experimentation is pretty high.
Just a couple of things for me to look into later.
Our WebLoad license expired sometime in the last year. The QA team wasn't really driving the use of automated testing (the eternal fire-drill problem), and when it lapsed we were stuck. I think Margaret did get some kind of grandfather clause, and they've been poking at the newer work with it. I couldn't get them to use JMeter due to the lack of a GUI they grokked and the need to tinker. (I can understand that being in an eternal support fire drill leads one to not want to tinker!)
I've been wondering if Selenium would help a bit, or if there's a way to lower the expressive boundary for writing test scripts.
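One way to lower that boundary might be to hide the concurrency plumbing behind a tiny harness, so a test author writes only a plain function. A minimal sketch of the idea (the scenario is stubbed here; a real script would swap in an HTTP request against the app under test):

```python
# Minimal load-test harness sketch: the test author writes only `scenario()`;
# the harness supplies threads and tallying. The scenario below is a stub --
# a real one would make an HTTP request and return a status label.
import threading
from collections import Counter

def run_load(scenario, users: int, iterations: int) -> Counter:
    """Run `scenario` from `users` threads, `iterations` times each,
    and tally its return values."""
    results = Counter()
    lock = threading.Lock()

    def worker():
        for _ in range(iterations):
            outcome = scenario()
            with lock:
                results[outcome] += 1

    threads = [threading.Thread(target=worker) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

def scenario():
    # Stub: pretend we fetched a page and it succeeded.
    return "ok"

print(run_load(scenario, users=5, iterations=10))  # Counter({'ok': 50})
```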
In between bobcat observations (oh, there goes another tree!) I ran across Drools. It looks like what we were doing 15 years ago at IntelliCorp: rule systems and objects. It's another Rete algorithm implementation, but could be fun. It has a Groovy implementation too; cool. I left as the SAP cloud descended on IntelliCorp, but the work done there continues to influence me today. Heck, an old customer looked me up a few weeks ago to reminisce about rule systems and nuclear power plants. (Do not attempt that at home.)
Now to see why our partners on the Registry team took two weeks to tell us they couldn't access our SVN repository, and to ask what our problem was. Escalated up the management tree, to boot. The actual problem: they assumed we had installed the web front end, which we hadn't. Hello? Just ask!