I had a great experience at the eLearning Guild’s Learning Solutions Conference last week. The days were long, but the time was really valuable. My own session on Avoiding Voice Over Script Pitfalls went very well. I had a very active, engaged audience. We even had a voice over artist and an editor attending, which was perfect for my session. I’ve had some requests to give a virtual version of my session, so stay tuned for that.
It was so much fun to meet people in person whom I'd previously only known online. I've built so many relationships with people online, but it's great to see them live and connect in a different way.
I took about 30 pages of notes over the 3 days. While everything is still fresh in my mind, I want to record some highlights. This list includes one thing I can use from each session I attended. These aren't necessarily the most important points from the speakers; in fact, some of them came from tangents. I'm focusing on what I think I can apply in my own work.
Information on all sessions can be found on the LS Con website.
Tim Gunn: A Natty Approach to Learning and Education
“There’s nothing more powerful than the word no.” Gunn talked about this in terms of advocating for designers’ intellectual property rights, but I think it applies to working with clients and SMEs as well.
Connie Malamed: Crash Course in Information Graphics for Learning
I loved the ideas for data visualization from this presentation. I don’t do infographics often, but I do need to present data and charts in courses (including one of my current projects). My big takeaway is that I need to do more sketching for charts. I’ve started to do more pencil and paper sketching for course layouts thanks to Connie’s last book, but visual brainstorming for charts would be helpful too.
Mark Sheppard: Building a Learning and Social-Collaborative Ecosystem in Slack
One note is that Learning Locker supports xAPI integrations that can talk to Slack and pull data. Even without xAPI, you can get other data out of Slack, like how many emojis were used to answer a poll.
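As an illustration of that last point, here's a minimal sketch of how you might count the emoji reactions on a Slack poll message using Slack's Web API via the slack_sdk Python library. This is my own example, not something shown in the session, and the token, channel ID, and message timestamp are placeholders.

```python
# Minimal sketch: count emoji reactions on a poll message in Slack.
# Assumes a bot token with the reactions:read scope; values below are placeholders.
from slack_sdk import WebClient

client = WebClient(token="xoxb-your-bot-token")

# Fetch the reactions attached to a specific message (e.g., a poll you posted).
response = client.reactions_get(channel="C0123456789", timestamp="1234567890.123456")

# Each reaction has a name (the emoji) and a count of people who used it.
for reaction in response["message"].get("reactions", []):
    print(f"{reaction['name']}: {reaction['count']}")
```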
Julie Dirksen: Diagnosing Behavior Change Problems
How many times has a client or SME given you a vague objective like, “Improve customer service”? That’s a nice business goal, but what does that mean for measurable performance? What behavior do you want to change? Julie shared her “photo test” for identifying behaviors. What would that behavior look like if you took a photo or video of it? Asking that question can help get to an observable behavior you can measure.
Karen Kostrinsky: Storytelling for Learning Impact
Think about the titles for your courses. What’s the most important takeaway? How can you put that takeaway in the title?
This session also had some discussion around the difference between scenarios and stories. Some people raised objections to using stories. I’m planning some future blog posts around those objections and questions.
Glen Keane: Harnessing Creativity in a Time of Technological Change
My favorite quote (I’ve already used it with a client during a video call): “Technology is like a 3-year-old at a formal dinner. Just when you want it to be on its best behavior, it starts acting up.” On a more serious note, Keane talked about how creativity means he can see something in his head, but he has to figure out how to get you to see it too. That’s a challenge we face in creating elearning: we can see it in our heads (or the SMEs can see it in theirs), but we have to get it into a format learners can use.
Jane Bozarth: New Ideas for Using Social Tools for Learning
Jane shared lots of inspiration in this session (who knew that the TSA had a great Instagram account?). What I’m going to use first is a Pinterest board for sharing book lists. I started a draft version, but I want to switch the order (I forgot to load them backwards) and move this to a professional account rather than my personal one.
Jennifer Hofmann: Mindsets, Toolsets, and Skillsets for Modern Blended Learning
One quote stood out: “If you can test it online, you can teach it online.” When you think about blended learning, think about goals and objectives first, then assessment. Decide on the instructional strategy, technique, and technology after you figure out the assessment. Maybe some parts of the skill can’t be taught and assessed online, but think about the parts that can be.
Will Thalheimer: Neuroscience and Learning: What the Research Really Says
The big takeaway is that we should be skeptical of claims that we can use neuroscience to improve learning. The reality is that we don’t yet know enough about neuroscience to really improve learning design. Sometimes what people claim is neuroscience (meaning research based on fMRI data) is actually earlier cognitive psychology research with an incorrect label.
Panel: What’s Wrong with Evaluation?
This was with Will Thalheimer, Julie Dirksen, Megan Torrance, and Steve Forman, with JD Dillon moderating. Can’t you tell from just the list of names that this was a good discussion?
Julie Dirksen made the point that we as instructional designers don’t get enough feedback on our own work. We don’t really know whether what we’re doing is working or not. It takes 10,000 hours (give or take) to become an expert, but that only works if you get the right kind of feedback to continuously improve your practice.
On a related note, Megan Torrance asked, “Why don’t we do A/B testing on training?” I saw an example of that at the DemoFest, but I admit I’ve never done it myself. Maybe there’s a way to set that up for a future project so I can test what method really works (and get feedback for my own practice in the process).
Jennifer Nilsson: Best Practices Training Should Steal from Software Development
We talk a lot about stealing agile methods from software development, but Jennifer’s presentation was about other proven practices. For example, software developers add comments to their code to explain what something does and why it was done a certain way. We can’t always add comments in our development tools the way you can in true coding, but we can add notes in an off-screen text box. That’s an easy solution that will save a lot of time if I have to go back to a complicated interaction a year later.
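For anyone who hasn't seen this practice in action, here's a hypothetical snippet showing the kind of "why" comment developers leave for their future selves. The scenario and numbers are mine, not from the session.

```python
def round_duration(seconds: int) -> int:
    # Round up to the next 15-second increment because the voice over files
    # are delivered in 15-second blocks, and partial blocks caused timing
    # drift in the published course. (This explains the "why," not just the "what.")
    return ((seconds + 14) // 15) * 15
```

The same idea works in an authoring tool: an off-screen text box noting why an interaction was built a certain way can save you from reverse-engineering your own decisions later.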
Diane Elkins: Responsive Design: A Comparison of Popular Authoring Tools
The first thing I’m going to change as a result of this session is what questions I ask clients after they say they want a mobile solution. I haven’t been asking enough follow up questions to understand what clients really mean by “responsive.” Do they mean tablets only? Are they OK with landscape only for phones? Is a scalable solution enough, or do they really want it fully responsive (adaptive)?
Julia Galef: Embracing a Mindset of Continuous Learning
We all use motivated reasoning sometimes and ignore evidence that doesn’t support the outcome we want. One way to check if you’re vulnerable to self-deception on a specific topic is the “button test.” Imagine you could press a button and find out the absolute, complete truth about something. If you find yourself hesitating to push that button, you might be vulnerable to motivated reasoning on that topic. If you know that, you can be aware of your cognitive biases and be more careful.
Photos
I took photos during the sessions and of the lovely sketchnotes created for many sessions (including sessions I didn’t attend). Email readers, you may need to click through to the post to see the gallery of images.
