Content Operations
Scriptorium delivers industry-leading insights for global content operations.
193 episodes
Building your futureproof taxonomy for learning content (podcast, part 2), 22:12
In our last episode, you learned how a taxonomy helps you simplify search, create consistency, and deliver personalized learning experiences at scale. In part two of this two-part series, Gretyl Kinsey and Allison Beatty discuss how to start developing your futureproof taxonomy, from assessing your content needs to lessons learned from past projects.

Gretyl Kinsey: The ultimate end goal of a taxonomy is to make information easier to find, particularly for your user base because that's who you're creating this content for. With learning material, the learner is who you're creating your courses for. Make sure to keep that end goal in mind when you're building your taxonomy.

Related links:
Taxonomy: Simplify search, create consistency, and more (podcast, part 1)
The challenges of structured learning content (podcast)
DITA and learning content
Metadata and taxonomy in your spice rack
Transform L&D experiences at scale with structured learning content

LinkedIn:
Gretyl Kinsey
Allison Beatty

Transcript:

Introduction with ambient background music

Christine Cuellar: From Scriptorium, this is Content Operations, a show that delivers industry-leading insights for global organizations.

Bill Swallow: In the end, you have a unified experience so that people aren't relearning how to engage with your content in every context you produce it.

Sarah O'Keefe: Change is perceived as being risky; you have to convince me that making the change is less risky than not making the change.

Alan Pringle: And at some point, you are going to have tools, technology, and process that no longer support your needs, so if you think about that ahead of time, you're going to be much better off.

End of introduction

Allison Beatty: I am Allison Beatty.

Gretyl Kinsey: I'm Gretyl Kinsey.

AB: And in this episode, Gretyl and I continue our discussion about taxonomy.

GK: This is part two of a two-part podcast.

AB: So if you don't have a taxonomy for your learning content, but you know you need one, what are some things to keep in mind about developing one?

GK: Yeah, so there are all kinds of interesting lessons we've learned along the way from working with organizations who don't have a taxonomy and need one. And I want to talk about some of the high-level things to keep in mind, and then we can dive in and think about some examples there. One thing I also want to say upfront is that it is very common for learning content in particular to be developed in unstructured environments and tools like Microsoft Word or Excel. It's also really common, if you are working within a learning management system or LMS, for there to be a lack of overall consistency, because the trade-off there is that you want flexibility, right? You want to be able to design your courses in whatever way is best suited for that specific subject or that set of material. But that's where you do have that trade-off between how consistent the information and its organization are versus how flexible it is to give your instructional designers that maximum creativity. And when you've got those kinds of considerations, that can make the information harder for your students to find or to use, and even for your content creators.
So we've seen organizations where they've said, "We've got all of our learning materials stuck in hundreds of different Word files or spreadsheets, or sometimes in different LMSs, or in different areas of the same LMS." And when they have all of those contributors, like we talked about with multiple authors contributing, or sometimes lots and lots of subject matter experts contributing part-time, that really creates these siloed environments where you've got different little pieces of learning material all over the place and no one overarching organizational system. And that's typically the driving point where that organization will say, "We don't have a taxonomy. We know that we need one." But I think that is the first consideration: if you don't have one and you know you need one, the first question to ask is why? Because so often it is those pain points that I mentioned, that lack of one cohesive system, one cohesive organization for your content, and sometimes also one cohesive repository or storage mechanism. So that's typically where you'll have an organization saying, "We don't have a good way to connect all of our content and have that interoperability that you were talking about earlier, and we need some kind of a taxonomy so that even if we do still have it created in a whole bunch of different ways by a bunch of different people, when it gets served to the students who are going to be taking these courses, it's consistent, it's well-organized, and it's easy for people to find what they need." So I think that's the first consideration: if you've got that demand for a taxonomy developing, think about where that's coming from and then use that as the starting point to actually create your taxonomy.

And then I think one other thing that can help is to think about how your content is created. If you do have those disparate environments or you've got a lot of unstructured material, take that into account and think about building a taxonomy in a way that's going to benefit rather than hinder your creation process. And that is especially important the more people you have contributing to your learning material. It's really helpful to try to gather information and metrics from all of your authors and contributors, as well as from your learners. Any kind of feedback form, if you've got some kind of an e-learning or training website, lets you assess what your learners tell you: what was good or bad about the experience, what was difficult, or what would make their lives easier. That's really great information for you to have. But also from your contributors, your authors, your subject matter experts, your instructional designers: if they have a way to collect feedback or information on a regular basis that will help enhance the next round of course design, then all of that can contribute to taxonomy creation as well. When you start building a taxonomy from the ground up, you can look at all the metrics that you've been collecting and say, "Here's what people are searching for. We should make sure that we have some categories that reflect that. Here are difficulties that our authors are encountering with being able to find certain information and keep it up to date, or with being able to associate things with learning objectives. So let's build out categories for that." So really make sure that you use those metrics. And if you're not collecting them already, it's never too late to start.
I think the biggest thing to keep in mind is also to plan ahead very carefully and to make sure that you're thinking about the future, that you're doing futureproofing before you actually build and implement your taxonomy. And I know we both can probably speak to examples of how that's been done well versus not so well.

AB: Yeah, maintenance is so important.

GK: Yeah, and I think the more you think about it upfront, before you ever build or put a taxonomy in place, the easier that maintenance is going to be, right? Because we've seen a lot of situations where an organization will just start with a taxonomy, but maybe it's not broad enough. Maybe it only starts in one department: they have it for just the technical docs, but they don't have it for the learning material. And then down the road it's a lot more difficult to go in and rework that taxonomy for new information that came out of the learning department, when, if they had had that upfront, it could have served both training and technical docs at the same time. So thinking about that and doing that planning is one of the best ways to avoid having to do rework on a taxonomy.

AB: And I'm glad you brought up the gathering of feedback and insight from users before diving into building out a taxonomy. Because at the end of the day, you want it to be usable to the people who need that classification system. That is the most important part.

GK: Yeah, that's absolutely the end goal.

AB: Usability.

GK: Yeah, and I think a big part of that, like I've mentioned, planning ahead carefully and futureproofing, is looking at metrics that you've gathered over time, because that can help you to see whether something in those metrics or in that feedback is a one-off fluke or whether it's an ongoing, persistent trend or something that you need to always take into consideration from your end users. If you've got a lot of people saying the same things, a lot of people using the same search terms over time, that can really help you with your planning. And yeah, like you said, I think the ultimate end goal of a taxonomy is to make information easier to find, in particular for your user base, because that's who you're creating this content for. And with learning material, that's who you're creating your courses for. So you want to make sure that when you're building that taxonomy, that end goal is something you always keep in mind: how can we make this content easier for people to find and to use?

AB: Definitely. Something else that I am curious to get your take on is this planning stage. In my experience, I feel like there's never nothing to start with. Even if there aren't any formalized standards around classification of content, there's a colloquial system, right?

GK: Yes, very much so.

AB: Of how content creators or users think about and organize content, even if they're not necessarily using a taxonomy.

GK: Yeah. A lot of times it's very similar to what we just said about content structure itself. If you're in something like Microsoft Word or unstructured FrameMaker, even if there's not an underlying structure, a set of tags under that content, there is still an implied structure. You can still look at something like a Word document and say, "Okay, it's got headings at these various levels. It's got paragraphs. It's got notes," and you can glean a structure from that even though that structure does not exist in a designated form, right? So taxonomy is the same way.
You've got people using information and categorizing information, even if they don't have formal categories or a written-down or tagged taxonomy structure. There's always still some way that people are organizing that material so that they can find it as authors, or so that their end users can find it as the audience. And so that's also a really good place to draw from. If you don't have that formal taxonomy in place, you do still have an implied taxonomy somewhere. And that's where, going back to what you said about gathering the metrics, that's a lot of times how you can find it and start to root it out if you are looking for that starting point of "here's how we need to build this formal taxonomy." So I think that's step one: after you've figured out why you need to have that formal taxonomy in place, what's the driving factor behind it, then start hunting down that information about your existing implied taxonomy and how people are currently finding and categorizing information, because that will help you to at least start drafting something. And then you can further plan and refine it as you take into account the various metrics from your user base, and then gather information across all the different content-producing departments in your organization until you finally settle on what that taxonomy structure should look like.

AB: I know that the word taxonomy can sound complicated and scary and all that, but you're never really starting with the fear of a blank page. Taxonomies are everywhere and in everything, even if they're not formalized. Think about when you go to the grocery store and you know you need ketchup: you're going to go to the condiment aisle to find it. There's so much organization and hierarchy that already exists just in our day-to-day lives. There's never a fear of a blank page with taxonomies. There's just thinking of the future and being mindful that things may change and maintenance will happen.

GK: Exactly. I think that point that you made about even when you go to the grocery store, humans think in taxonomy, right? Humans naturally categorize things.

AB: And group things. Yeah.

GK: And so I think the main goal of having a taxonomy formalized is to take that out of people's heads and actually get it into a real form that multiple people can all use together, and then that serves that ultimate end goal we talked about of making things easier for your users to find.

AB: Access. Definitely. I want to talk about some lessons learned from taxonomies that you and I have worked on with clients, and I'm thinking of how you're never starting with a blank page. I'm thinking about one project in particular where we developed a learning content model and used Bloom's taxonomy as a jumping-off point for that learning model. That's another way to go about it: use the implied structure in combination with a structure that already exists, and integrate that into your content model. And then on the other hand, I know we've also done taxonomies for learning where we've specialized a lot.

GK: And specialization is always interesting because we see that develop when you are putting out information that is very specific. So, for example, if you are putting out learning material or courses around, I'll go back to the example from earlier, here's how to use this specific kind of software. Here's a class that you can take to get certified for doing this kind of an activity in this kind of software.
Then that's when it makes sense to think about any kind of specialized structures that you might want to have that are specific to that software. And it can be the same with whatever kind of material you're presenting. If you're saying, "Oh, we're in the healthcare industry. We're in the finance industry. We're in the technology industry," whatever your industry is, there's going to be information specific to that industry that you probably want to capture as part of your taxonomy. Those categories are going to be specific to that industry and to the product or material that you are producing, or to the learning material, the courses that you're creating. So that's a really good thing to think about when it comes to taxonomy development: if we are in a very specific industry where we need that industry-specific information in the taxonomy, then it's going to be really important to specialize. And if you're working in DITA XML, specialization means creating custom elements out of existing, standard, out-of-the-box ones. So whenever you think about a taxonomy that is driven by metadata in DITA XML, that's where you might start creating some custom metadata elements and attributes that can drive your taxonomy. And those custom names for those elements and attributes would be something that you do specialize, and that matches the requirements or the demands of your industry.
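As a concrete reference point: alongside specialization, DITA also offers subject scheme maps for formalizing a taxonomy without defining new elements. A subject scheme declares the category hierarchy and binds those controlled values to an attribute, so contributors pick from a fixed list instead of learning custom markup. Here is a minimal sketch; the element names are standard DITA (1.2 and later), but the key names and category values are hypothetical:

<subjectScheme>
  <!-- A small, hypothetical hierarchy of course subjects -->
  <subjectdef keys="learning-subjects">
    <subjectdef keys="installation">
      <subjectdef keys="basic-install"/>
      <subjectdef keys="advanced-install"/>
    </subjectdef>
    <subjectdef keys="administration"/>
    <subjectdef keys="certification-prep"/>
  </subjectdef>
  <!-- Bind the hierarchy to a standard conditional attribute so authors
       can only choose these values -->
  <enumerationdef>
    <attributedef name="otherprops"/>
    <subjectdef keyref="learning-subjects"/>
  </enumerationdef>
</subjectScheme>

In a setup like this, a topic tagged otherprops="basic-install" is automatically a member of the broader "installation" category, which is one way the hierarchy described above becomes machine-readable for search and filtering.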
AB: Yeah, that's spot on with the example I was talking about a while ago, about how the Library of Congress uses Library of Congress subject headings, but the National Library of Medicine has its own classification system for cataloging. But under the hood, they're both Dublin Core; they're both specialized Dublin Core. You know what I mean?

GK: Yes.

AB: There's different context and then… yeah, totally. Oh, this was the question I was going to ask you: is there a trade-off with heavy specialization in your taxonomy?

GK: I think the biggest trade-off is maintenance. We were talking earlier about how, when you're doing that initial planning, you want to think about futureproofing and how you can make it as easy to maintain as possible, within reason, of course, because nothing is ever easy when it comes to content development.

AB: That's true.

GK: But yeah, when it comes to heavy specialization, the biggest thing to consider is that for any kind of specialized tagging, you have to have specialized knowledge: people who understand the categories, who know how to build that specialization and how to maintain it. So you have to have those resources available, and you also have to think about, when you inevitably need to add or change the material, how much more difficult that is going to be with specialized tags. Maybe it's actually going to enhance things. Instead of making things more difficult, it might be a little bit easier if you are specializing, because you have already created custom categories before, and if you need to add one down the road, you've got a roadmap for that. But it really depends on your organization and the resources that you have available. And thinking specifically about learning content as well, I think one of the biggest areas where heavy specialization can be challenging is that it is typical to have so many part-time contributors and subject matter experts who are not going to be experts in the tagging system. They're just going to be experts in the actual material that they're contributing. And so if they have to learn how to use those tags to a certain extent, then sometimes the more customization or specialization you do, the more difficult that can be for those contributors, and it can make it difficult to get them on board with having that taxonomy in the first place.

AB: Yeah, change management.

GK: So I think that's the big trade-off: change management, maintenance, and thinking about the best balance for making sure that things are useful for your organization. That you've got the taxonomy in place that you need, but it's also not going to be so difficult to maintain that it essentially fails and your authors and contributors don't want to keep it going.

AB: This is a big question, but who's responsible for maintaining a taxonomy within an organization that develops learning content?

GK: So I think there's a difference here between who is responsible and who should be responsible.

AB: Oh, that's so true.

GK: If we think about best practice, it really should be, I would say, generally a small team designated for that role, who has an administrative role so that they can be in charge of governance over that taxonomy. Because if you don't have that, if you don't have the best practice or the optimal situation, then what can happen instead is that either no one's managing the taxonomy, which is obviously bad, because then it can just continue to spiral out of control, or it's almost a too-many-cooks-in-the-kitchen situation, where if you don't have that designated leadership or governance role over the taxonomy, and anyone can update it or make changes to it, then it loses all of its meaning, all of its consistency. I do think it's important that it's a small team and not one single person, because if that person is sick or something, then you're left high and dry. So you want to make sure it's a small enough team that you're not going to have the too-many-cooks-in-the-kitchen problem, but it's also not just one person.

AB: Another reason that it's not ideal to have just one person is that diversity prevents bias in your taxonomy, right?

GK: Absolutely.

AB: If one person has a confirmation bias about a specific facet and they document it or build something that way, but no one else in the organization… you know what I mean?

GK: Yeah. So that's where that small team can provide checks and balances too.

AB: Totally.

GK: You can have things set up where maybe every person on that team has to approve changes that are made to the taxonomy, or when they're initially designing it, they all give the final review and final approval on it, so that way you're not running everything through one person and whatever biases that person might carry.

AB: And bias doesn't necessarily have a negative connotation here; it's just that people see the world differently from person to person. And by world, I do mean learning content, sometimes. Is there anything else that you wanted to cover?

GK: I think I just want to wrap things up with the main points that we talked about for developing a taxonomy, whether it is for learning content or just more broadly. Plan ahead, think ahead, and do all of the planning upfront that you can, rather than just building things, so that you can avoid rework. Use the metrics and the information that you've gathered from both inside your organization and from your user base.
And finally, keep that end goal in mind: this is all about making things easier for people to use, for people to find content, so develop your taxonomy with that end goal in mind.

AB: Yeah, I agree with all of that. Well, thanks so much for talking with me, Gretyl.

GK: Of course. Thank you, Allison, for talking with me.

Outro with ambient background music

Christine Cuellar: Thank you for listening to Content Operations by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links.

Behind every successful taxonomy stands an enterprise content strategy. Building an effective content strategy is no small task. The latest edition of our book, Content Transformation, is your guidebook for getting started.
The post Building your futureproof taxonomy for learning content (podcast, part 2) appeared first on Scriptorium.
Taxonomy: Simplify search, create consistency, and more (podcast, part 1), 22:56
Can your learners find critical content when they need it? How do you deliver personalized learning experiences at scale? A learning content taxonomy might be your solution! In part one of this two-part series, Gretyl Kinsey and Allison Beatty share what a taxonomy is, the nuances of taxonomies for learning content, and how a taxonomy supports improved learner experiences in self-paced e-learning environments, instructor-led training, and more.

Allison Beatty: I know we've made taxonomies through all sorts of different frames, whether it's structuring learning content, or we've made product taxonomies. It's really a very flexible and useful thing to be able to implement in your organization.

Gretyl Kinsey: And it not only helps with that user experience for things like learning objectives, but it can also help your learners find the right courses to take, if you have some information in your taxonomy that's designed to narrow it down to a learner saying, "I need to learn about this specific subject." And that could have several layers of hierarchy to it. It could also help your learners understand what to go back and review based on the learning objectives. It can help them make some decisions around how they need to take a course.

Related links:
The challenges of structured learning content (podcast)
DITA and learning content
Metadata and taxonomy in your spice rack
Transform L&D experiences at scale with structured learning content
Rise of the learning content ecosystem with Phylise Banner (podcast)

LinkedIn:
Gretyl Kinsey
Allison Beatty

Transcript:

Introduction with ambient background music

Christine Cuellar: From Scriptorium, this is Content Operations, a show that delivers industry-leading insights for global organizations.

Bill Swallow: In the end, you have a unified experience so that people aren't relearning how to engage with your content in every context you produce it.

Sarah O'Keefe: Change is perceived as being risky; you have to convince me that making the change is less risky than not making the change.

Alan Pringle: And at some point, you are going to have tools, technology, and process that no longer support your needs, so if you think about that ahead of time, you're going to be much better off.

End of introduction

Gretyl Kinsey: Hello and welcome. I'm Gretyl Kinsey.

Allison Beatty: And I'm Allison Beatty.

GK: And in this episode, we're going to be talking about taxonomy, particularly for learning content. This is part one of a two-part podcast.

AB: So first things first, Gretyl, what is a taxonomy?

GK: Sure. A taxonomy is essentially just a system for putting things into categories, whether that's something concrete like physical objects or whether it's just information. A taxonomy is going to help you collect all of that into specific categories that help people find what they're looking for. And if you've ever been shopping before, you have encountered a taxonomy. I like to think about online shopping in particular to explain this, because you've got categories for the type of item you're buying at a broad level, which might look something like clothing, household goods, electronics, maybe food. And then within that you also have more specific categories. So if we start with clothing, you typically will have categories for things like the type of garment: whether you are looking for shirts, pants, skirts, coats, shoes, whatever. And then you also might have categories for the size, for the color, for the material.
There are typically categories for the intended audience, so whether it's for adults or kids, and then within that maybe for gender. So there are all these different ways that you can sort and filter through the massive number of clothing results that you would get if you just go to a store and look at clothing. You've got all of these different pieces of information, these categories that come from a taxonomy, where you can narrow it down. And that typically looks like things on a website such as search boxes, checkboxes, and drop-down menus, and those contain the assets or the pieces of information from that taxonomy that are used to categorize that clothing. So then you can go in and check off exactly what you're looking for and narrow down those results to the specific garment that you were trying to find. The ability to go on a website and do all of that is supported by an underlying taxonomy.

AB: So that's an example of online shopping. I'm sure a lot of people are familiar with taxonomies in the sense of biology, but how can taxonomies be applied to content?

GK: Sure. So we talk about taxonomy in terms of content for how it can be used to find the information that you need. When you think about that online shopping example, instead of looking for a physical product like clothing, when it comes to content, you're just looking for specific information. So it's kind of like the content itself is the product. If you are an organization that produces any kind of content, you can put a taxonomy in place so that your users can search through that content. They can sort and filter the results that they get according to those categories in your taxonomy. And that way they can narrow it down to the exact piece of information that they're looking for, instead of having to skim through a long website with a lot of pages, especially if you're delivering any kind of manuals or books or longer publications; you're not forcing them to read through all of that instead of being able to search and find exactly what they're looking for. Some of the ways that taxonomies can help you categorize your content would be things like what type of information it is: whether it is a piece of technical documentation, something like a user manual or a quick start guide or a data sheet, or whether it is marketing material or training material. You could put that as one of the categories in your taxonomy. You could also put a lot of information about your intended audience. That could be things like their experience level, the regions they live in, or the languages they speak: anything about that audience that's going to help you serve up the content that those particular people need. It can also be things like what platform your audience uses or what platform is relevant for the material that you're producing. It can be things like the product or product line that your content is documenting. There are all kinds of different ways that you can categorize that information. And I know that both of us have a lot of experience with putting these kinds of things together, so I don't know if you've got any examples you can think of for how you've seen information get categorized.

AB: A lot of the way I think about taxonomies is through library classification systems or MARC records, in the same way that if you wanted to find a particular information resource, you went to your library's online catalog and could filter down to something that fits your needs.
You can think of treating your organization's body of content like a corpus of information that you can further refine and assign metadata values to. Or in the case of a taxonomy hierarchy, in the clothing example, choosing that you want a shirt would be a step above choosing that you want a tank top or a long-sleeve shirt or a blouse. So a lot of my mindset around taxonomies for content is framed like libraries. The Library of Congress subject headings are generally a good starting point for a library. But sometimes a library has specific information needs: the National Library of Medicine has its own subject scheme that is further specialized than the broader categories you get in Library of Congress subject headings, because they know that everything in that corpus is going to be health- or medicine-related information. And in the same way, you and I have developed taxonomies for clients that are particular to their needs. You're never going to start off knowing nothing when you build a taxonomy, right?

GK: Exactly. And with the example that you were talking about of looking at information in a library catalog, we see that with a lot of documentation. If you're thinking about technical content and things like product documentation, user guides, and user manuals, we see that similar kind of functionality. If you have that content available through a website or an app or some other kind of digital online experience, back to the online shopping example, your user base can, in all of those different cases, go to those facets and filters, those checkboxes, drop-down menus, and search boxes, and start narrowing down the information to exactly what they're looking for. So having that taxonomy in place underlying the information really helps to enhance the user experience by making it easier to narrow down. I've also seen it be really helpful on the authoring side. If you have a large body of content, maybe you have it in something like a content management system, and the more content you have, the harder it becomes to find the specific information that you're looking for. In particular, we deal with a lot of DITA XML, and so there will typically be a component content management system that the content is housed in. Those systems typically have some kind of underlying taxonomy in place as well that can capture all kinds of information about how and when the content was created, so that can help you find it. And then of course, you could have your own taxonomy for the kinds of things I named earlier, like what type of information it is and what the intended audience is, in case that can help you as the author find and narrow down something in your system. And it can also help you as an author to put together collections of content for personalized delivery. Maybe you have a general version of your user guide, but then you've also got audience-specific versions that you can filter and narrow down to based on the metadata in your content. And that's all going to be informed by those categories in your taxonomy. So really leveraging any of the information that you have about your audience, about how they use your content or how they need to use your content, is going to help you deliver it in a more flexible and more efficient way as well.

AB: I know for me personally, sometimes the amount of information out in the world can get very overwhelming.

GK: Absolutely.
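To make the authoring-side point concrete, here is a minimal sketch of what category metadata can look like on a single DITA topic. The prolog elements shown (audience, category, prodinfo, othermeta) are standard DITA; the specific values, the product name, and the othermeta names are hypothetical examples, not anything from a real client taxonomy:

<concept id="taxonomy-overview">
  <title>Taxonomy overview</title>
  <prolog>
    <metadata>
      <!-- Who the content is for -->
      <audience type="user" experiencelevel="novice"/>
      <!-- What kind of content it is -->
      <category>training</category>
      <!-- Which product it documents (hypothetical product name) -->
      <prodinfo>
        <prodname>ExampleProduct</prodname>
        <vrmlist><vrm version="5.0"/></vrmlist>
      </prodinfo>
      <!-- Extra facets a custom taxonomy might add (hypothetical names and values) -->
      <othermeta name="region" content="north-america"/>
      <othermeta name="delivery-platform" content="lms"/>
    </metadata>
  </prolog>
  <conbody>
    <p>Course content goes here.</p>
  </conbody>
</concept>

A CCMS or publishing pipeline can then read these values to drive the search facets and the audience-specific deliverables described above.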
AB: So I'm thinking about our LearningDITA e-learning project, and how much content we've collected between different versions of it over the time it's been up, and it makes it so much easier to navigate when I know where pieces of content are as an author on that project.

GK: And that actually brings up a really good point, because when we were talking about the taxonomies used in content, we were primarily talking about technical content: things like product documentation, user guides, legal, and regulatory material. But taxonomy can also be used for other types of content, and learning content is a really big one; we are seeing that more and more.

AB: Absolutely.

GK: There's a lot of overlap at organizations between technical documentation and learning or training material, especially if you make a product where there are certifications. We see this a lot, for example, with people who make software. That organization will usually have the product documentation: here's how you use this software. But then there's also training material, so that if there are certifications around the use of that software, there's material where their user base can go take a class and essentially be students or learners in that context rather than just consumers of the product. So there's a lot of need to share information across the technical documentation and the learning material. And we see more and more organizations, where the learning material is essentially their main product, looking for ways to better categorize that information and have a taxonomy underneath it. When you mentioned LearningDITA, that got me thinking about how much a taxonomy helps that experience, not only for us as the creators of LearningDITA, but for all the other organizations that also produce learning material, and not only for them as the authors, but also for their end users.

AB: It's a win-win for users and creators. Something I would like to discuss is self-guided e-learning, and how a taxonomy can make it easier to tie assessments to learning objectives in that sort of asynchronous setting, as opposed to a more traditional classroom.

GK: And e-learning is really interesting because there's a lot of flexibility out there in terms of how you can present that information and how you can gather information from the students or the learners taking your e-learning courses. We've seen different categories or taxonomies around putting information on your learning material about things like the intended reading level or grade level if you're dealing with students who are still in school. You could also put information about things like the industry, if your learner base is professionals. You can put information about the subject that you're covering, or the type of certification associated with that material. And then, like you mentioned, learning objectives. Typically, with any kind of course that's put out there for students to take, whether it's e-learning or in a classroom, there are specific learning objectives that the material is intended to cover. So whenever you as a student get to the end, it's basically: you should be able to understand this concept or perform this activity as a result of taking this course. And we have seen a lot of demand in various different industries for tying those learning objectives to the assessment questions.
So if you're in an e-learning course, you've got your self-guided material where you're walking through, you're reading, maybe you're doing some exercises, maybe you're watching some videos or looking at some examples. And then at the end there's some kind of a quiz or an assessment to test your knowledge. With e-learning, that's typically something where you're entering answers: maybe you're checking boxes for multiple-choice questions, or you're typing a response in, or you're picking true or false, things like that. So you take that quiz, and the questions in that quiz are tied back to those learning objectives from the beginning of the lesson. That way, if you get a question wrong, it can tell you the specific learning objective that you missed that question for, and that you should go back and review more material associated with that learning objective. And having all of that tied together so that your e-learning environment can actually serve up that information is where it can really help to have a taxonomy underneath. When you think about it, learning objectives themselves naturally fall into categories. And there are even standards, when you think about things like Bloom's taxonomy; that's a typical standard that's applied to learning material. And of course you could also come up with whatever categories you want for your learning information, but those objectives are often tied directly to the categories. And then having the structure in place to tie those objectives, and the taxonomy categories associated with them, to your assessment questions and to the rest of your material just makes the whole experience a lot more seamless and streamlined for your learners.

AB: It's so valuable, particularly learning objectives. I'm glad you brought up Bloom's taxonomy because I think that's a pretty familiar entry point to taxonomies for a lot of people who work in the learning space. And I'm also thinking, whether it's learning content or technical documentation, about any implementation of a taxonomy for a body of digital content. It sort of turtles all the way down, whether it's a learning objective that is the value or significance being assigned to a piece of content. If you think about information theory, the basis of what a node in a taxonomy is, is that it's a discrete thing. And I know it drives people crazy; "thing" is more or less the technical term in that situation. It sounds so vague, but the thing is a discrete object that has a purpose for why it exists, whether it's a learning objective that's tied as an attribute in your DITA or a piece of metadata somewhere or elsewhere, or whether it's technical documentation telling you which product a piece of content is assigned to. I know we've made taxonomies through all sorts of different frames, whether it's structuring learning content, or we've made product taxonomies. It's really a very flexible and useful thing to be able to implement in your organization.
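For readers working in DITA, the Learning and Training specialization mentioned in the related links provides standard markup for this objectives-to-assessment link. Here is a minimal, hypothetical sketch: the element names (lcObjective, lcTrueFalse, and so on) come from the DITA Learning and Training specialization, but the ids, the question text, and the use of a data element to point a question at its objective are illustrative conventions for this example, not markup prescribed by the spec:

<!-- In a learningOverview topic: the objective for the lesson -->
<lcObjectives>
  <lcObjectivesGroup>
    <lcObjective id="obj-define-taxonomy">Define what a taxonomy is</lcObjective>
  </lcObjectivesGroup>
</lcObjectives>

<!-- In a learningAssessment topic: a question tied back to that objective -->
<lcInteraction id="q1">
  <lcTrueFalse id="q1-tf">
    <lcQuestion>A taxonomy is a system for putting things into categories.
      <!-- Hypothetical convention: record which objective this question tests -->
      <data name="objective" value="obj-define-taxonomy"/>
    </lcQuestion>
    <lcAnswerOptionGroup>
      <lcAnswerOption>
        <lcAnswerContent>True</lcAnswerContent>
        <lcCorrectResponse/>
      </lcAnswerOption>
      <lcAnswerOption>
        <lcAnswerContent>False</lcAnswerContent>
      </lcAnswerOption>
    </lcAnswerOptionGroup>
  </lcTrueFalse>
</lcInteraction>

With a link like this in the markup, a delivery platform can report which objective a missed question maps to and point the learner back to the right material, which is the behavior described above.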
GK: And it not only helps with that user experience for things like learning objectives, but it can also help your learners find the right courses to take, if you have some information in your taxonomy that's designed to narrow it down to a learner saying, "I need to learn about this specific subject." And that could, of course, have several layers of hierarchy to it. It could also help your learners understand what to go back and review based on the learning objectives. It can help them make some decisions around how they want to take a course. When you think about e-learning, it can be self-guided and asynchronous, or sometimes it can be instructor-led. And if you've got something like that baked into your taxonomy, something about the method of delivery, that could help your learners decide which mechanism is going to be better for them. So all of that can be really helpful. And I also want to talk about it again from the creator side, just like we did with technical content. Because if you are designing learning material, you're an instructional designer, you're putting together a course, then you might want some information about things like the learners' progress and their understanding of the material. You're obviously going to want to capture all the information around the scoring and grading from the assessments that they take. And having that tied back to a taxonomy, whether it's to learning objectives or to any other information, can help you understand how you might need to adjust the material. If you notice, for example, that you've got one learning objective that everyone seems to struggle to understand, that a large percentage of your students are missing the assessment questions associated with that learning objective, then maybe that tells you we need to go back and rewrite this or rework how it's presented. So the taxonomy can not only help your learners find the information, navigate the courses, and take the courses that they need, but it can also help you adjust the design of those courses in a way that further enhances their learning experience.

AB: Absolutely. Something else that you just made me think of: say you have an environment of creating learning content with multiple authors. Another advantage of the taxonomy is that it can standardize metadata values. So say you and I, Gretyl, are working within the same learning organization; when content that's written by either one of us goes to publish, the metadata values will be standard if we use the same taxonomy.

GK: And that's also a really important point, because that standardization is good not only across just a subset of your content, like your learning material. We've seen some organizations go broader and say, "Our learning content and our technical docs and our marketing material," and whatever other content they have, all need to have a consistent set of terminology and a consistent set of categories that people use to search. So you can think about taxonomy at a broader level too, for all the information across the entire company or organization, and make sure that it's all going to fit into those categories consistently. Because it is, like you said, very typical to have lots of different people contributing to content creation. And in particular, with learning content, we see a lot of subject matter experts and part-time contributors who do something else, but then they might write some assessment questions or a lesson here and there. Having that consistent categorization of information, consistent terminology, and consistent application of metadata is really, really helpful when you've got so many different people contributing to the content, because that helps to make sure they're not going to be introducing inconsistencies that confuse your end users.
AB: That's really a strength of most classification systems, whether it's a controlled vocabulary or something more sophisticated like a taxonomy. And something that you and I see a lot working with clients, with DITA XML in particular, is blending technical and marketing content once DITA is implemented, and having interoperability with your taxonomy is definitely a boon to that.

GK: Absolutely. I think that's a good place to wrap up for now. We'll be continuing this discussion in the next podcast episode. So Allison, thank you.

AB: Thank you.

Outro with ambient background music

Christine Cuellar: Thank you for listening to Content Operations by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links.

Behind every successful taxonomy stands an enterprise content strategy. Building an effective content strategy is no small task. The latest edition of our book, Content Transformation, is your guidebook for getting started.
The post Taxonomy: Simplify search, create consistency, and more (podcast, part 1) appeared first on Scriptorium.
Transform L&D experiences at scale with structured learning content (20:27)
Ready to deliver consistent and personalized learning content at scale for your learners? In this episode of the Content Operations podcast, Alan Pringle and Bill Swallow share how structured content can transform your L&D content processes. They also address challenges and opportunities for creating structured learning content. There are other people in the content creation world who have had problems with content duplication, having to copy from one platform or tool to another. But I will tell you, from what I have seen, the people in the learning development space have it the worst in that regard — the worst. — Alan Pringle Related links: The challenges of structured learning content (podcast) DITA and learning content Rise of the learning content ecosystem with Phylise Banner (podcast) Flexible learning content with the DITA Learning and Training specialization Building an effective content strategy is no small task. The latest edition of our book, Content Transformation, is your guidebook for getting started. LinkedIn: Alan Pringle Bill Swallow Transcript: Disclaimer: This is a machine-generated transcript with edits. Introduction with ambient background music Christine Cuellar: From Scriptorium, this is Content Operations, a show that delivers industry-leading insights for global organizations. Bill Swallow: In the end, you have a unified experience so that people aren't relearning how to engage with your content in every context you produce it. Sarah O'Keefe: Change is perceived as being risky; you have to convince me that making the change is less risky than not making the change. Alan Pringle: And at some point, you are going to have tools, technology, and process that no longer support your needs, so if you think about that ahead of time, you're going to be much better off. End of introduction AP: Hey, everybody, I'm Alan Pringle. BS: I'm Bill Swallow. AP: And today, Bill and I want to talk about structured content in the learning and development space. I would say, over the past two years or so, we have seen significantly increased demand from organizations that want to apply structured content to their learning and development processes, and we want to share some of the things those organizations have been through and what we've learned over the past few months, because I suspect there are other people out there who could benefit from this information. BS: Oh, absolutely. AP: So let's talk about, really, the drivers. What is driving people, content creators in the learning development space, to it? One of them off the bat is so much content, so, so very much content, on so many different delivery platforms. That's one that I know of immediately; what are some of the other ones? BS: Oh, yeah, you have just the sheer amount of content, the number of deliverables, and the duplication of content across all of them. AP: That is really the huge one, and I know there are other people in the content creation world who have had problems with content duplication, having to copy from one platform or tool to another. But I will tell you, from what I have seen, the people in the learning development space have it the worst in that regard—the worst. BS: Didn't they applaud you when you showed up at a conference with a banner that said "end copy/paste"? AP: Pretty much, it's true.
That very succinct message raised a lot of eyebrows, because they are in the position, unfortunately, in learning and development, of having to do a lot of copying and pasting. Part of the reason for that copying and pasting is, a lot of times, the different platforms that we've mentioned, and also different audiences. I need to create this version for this region, or this particular type of student at this location, so they're copying and pasting over and over again to create all these variants for different audiences, which becomes unmanageable very quickly. BS: Yeah, copying, pasting, and then reworking. And then, of course, when they update it, they have to copy, paste, and rework again in all the other places it belongs, and then they have to handle it in however many languages they're delivering the training in. AP: So now, everything is just blown up. I mean, how many layers of crap, and I'm just going to say it, do these people have to put up with? And there are many, many, many. BS: Worst parfait ever. AP: Yeah, no, that is not a parfait I want to share, I agree with you on that. So let's talk about the differences between, say, the techcomm world and the learning and development world and their expectations for content. Let's talk about that, too, because it is a different focus, and we have to address that. BS: So techcomm really is about efficiency and production, so being able to amass quite a wide mass of content and put it out there as quickly as possible, or put it out there as efficiently as possible. Learning content kind of flips that on its head, and it wants to take quality content and build a quality experience around it, because it's focused on enabling people to learn something directly. AP: And techcomm people, we're not saying you're putting out stuff that is wrong or half-assed. That is not what we mean, I want to be real clear here. What we mean is, there is a tendency to focus on efficiency gains, and getting that help set, getting that PDF, getting that wiki, whatever thing it is that you're producing, getting that stood up as quickly as possible, whereas on the learning side, speed is not usually the thing that you're trying to use to sell the idea of structured content. I don't think that's going to win a lot of converts in the learning space. I do think, however, you can make the argument that if you create this single source of truth so you can reuse content for different audiences, different locations, different delivery platforms, and you're using the same consistent information across all of that, you are going to provide better learning outcomes, because everybody's getting the same information. Regardless of what audience they're in or what platform they're learning on, whether it's live instructor-led training, something online, whatever else, they're still getting the same correct information, whereas if you were copying and pasting all that, you might've forgotten to update it in one place as a content creator, and then someone, a student, a learner, ends up getting the wrong information, and that's when you're not in the optimal learning experience situation. BS: Right, and it's not to say that every single deliverable gets the exact same content, but they get a slice from the same shared centralized repository of content so that they're not rewriting things over and over and over again.
And they're still able to do a lot of high-quality animations, build their interactives, put together their slide presentations, everything like that, but use the content that's stored centrally rather than having to copy and paste it again and again and again. AP: Yeah, and let's talk about, really, the primary goals for moving to structured content for learning and development folks. We've already talked about reuse quite a bit, that's a big one. Write it one time, use it everywhere, and that also leads to profiling: creating content for different audiences. BS: Right, I mean, these goals really are no different from what you see in techcomm, and what techcomm has been using for the past 15, 20, 25 years. It is that reuse, that smart reuse, so write it once, use it everywhere, no copy/paste, having those profiling attributes and capabilities built in so that you can produce those variants for beginner learners versus expert learners versus people in different regional areas where the procedure might be a little bit different, producing instructor guides as well as learner guides. All of these different ways of mixing and matching, but using the same content set to do that. AP: Yeah, it's like one of our clients said, and I have to thank them forever for bringing this up. They were bogged down in a world of continuous copying and pasting over and over and over again, and maintaining multiple versions of what should've been the same content, and they said, quote, "We want to get off the hamster wheel." And that is so true and so fitting, and we probably owe them royalties for saying this over and over again, because it's such a good phrase. But it really did capture, I think, a big frustration that a lot of people in the learning and development space have creating content, because they do have to maintain so many versions of content. BS: And those versions likely are stored in a decentralized manner, so they could be on multiple different servers, they could be on multiple different laptops or PCs, they could be on thumb drives in some random drawer that are updated maybe once every two, three years. So being able to pull everything together into a central repository and structure it so that it can be intelligently reused and remixed, there are so many benefits to that. AP: Yeah, and in regard to the remixing, the bottom line is, you want the ability to publish to all your different platforms. I believe the term people like to use is omnichannel publishing, so you can do push-button publishing to basically any delivery need that you have, whether it's an instructor versus student guide for training you're holding live, e-learning, or even scripts for video. Even when you're dealing with a lot of multimedia content, there is still text involved, underpinnings of that content. Audio and video, there are still probably bits and pieces of those that can come from your single source of content, because at the core of it, it's text-based, even if the delivery of it is video or audio. BS: Now, we've had structured content for a good couple decades, at least- AP: At least, yeah. BS: … but there really is a reason why the learning world really hasn't latched onto it completely, and it really comes down to the different types of content that they need to produce versus what traditionally a techcomm group would do. So right off the bat, there are all the different tests, quizzes, and so forth, all the assessments that are built into a learning curriculum.
There was never really anything built to handle those in traditional structured authoring platforms and schemas. AP: And there are solutions now that will let you handle assessments and different types of questions, and things like that. BS: But the whole approach to producing learning content, it's quite similar to techcomm and to other classic content development, but it's also quite unique in its own right, and we do have to make sure that all of those different needs, whether it be the assessments, any interactives that need to be developed, making sure that you tie in a complete learning plan, and perhaps even business logic to your content, making sure all that can be baked in intelligently so that we're able to produce the things that we need to produce for trainers. AP: Yeah, and now, especially, you have to be able to create content that integrates easily with the learning management system, which has its own workflows. It's got tracking, it tracks progress, it scores quizzes, it keeps track of what classes you've taken, prerequisites, all of that stuff. That is a whole delivery ecosystem, and structured content can help you communicate with an LMS and create content that is LMS-friendly by baking in a lot of the things that you just talked about. BS: And the content really does boil down to a more granular and targeted presentation to the audience, rather than techcomm, which is more of a soup-to-nuts, everything-and-the-kitchen-sink approach. AP: Yeah, and then there's also the whole live delivery aspect; that is not something that's really part of techcomm at all. BS: I wouldn't want someone there reading a manual to me. AP: No, nor would I. Well, it might be a good way to treat insomnia, but that's not what we're here for. But you do have to consider, the assessments are a big difference from a lot of other content that is a good fit for the structured world, and then the possibility of live instruction, that's also another big difference. Still, there are structured content solutions that can help you with both of those very distinct learning and development content situations. So I think it's fair to say, based on talking to a lot of people at conferences focused on learning, and a lot of our clients, that the traditional way of creating learning and development content is not scalable. The copy-and-paste angle in particular is just not sustainable in any way, shape, or form. BS: No, you only have so many hours in a day, so if you need to start producing more, you really need to start adding more people. And if you add more people, then you have the likelihood that more things could go wrong with the content, or the content could get- AP: Will go wrong. BS: … could get out of sync with itself. AP: Yeah. Well, let's talk also a little more about some of the challenges. We've talked about the interactivity, how that and the assessments are something particular that you have to solve for in the learning space. Let's talk about the P word, PowerPoint. BS: PowerPoint. Yeah, being able to pull those slides together, which really would likely have a very small subset of a course's content built within them, unless you're producing a wall of text per PowerPoint. Those are quite unique to the space, so you don't see much in techcomm where things are delivered via PowerPoint, or you hopefully don't.
AP: No, PowerPoint is great because it's wide open and you can do a lot of things with it; PowerPoint is bad because it's wide open and you can do a lot of things with it. That's the problem with PowerPoint. BS: And a template's only as useful as those who follow it. AP: Exactly. And now, you mentioned templates. Structured content is a way to templatize and standardize your content, and I'm sure that can rub people the wrong way. My slides need to be special, this, that, and the other. There's a continuum here of, I want to do whatever I want to the point of sloppy, or I can do things within this particular set of confines so there is consistency. And again, I think it's fair to say, providing consistency for different learners with slide decks is going to make for some better outcomes than a free-for-all, I-can-do-whatever-I-want scenario. And I'm sure there are people out there who are going to kick and scream and disagree with me, but that's a fight we're just going to have to have, folks. BS: Well, no, it provides a consistent experience throughout, rather than having some jarring differences from lesson to lesson or course to course. AP: Yeah, yeah, and I think there's one thing, too, in addition to the PowerPoint angle: in the learning and development space, there is this focus on one-off deliverables, where we need to create this thing, and that thing, and this other thing, each separately. There's still standardization you can do among your different delivery targets that will streamline things, create consistency, and therefore a better learning experience. I do believe that's true, even though some people, at first in particular, can find it very confining. BS: Oh, right, I mean, it just takes the development of the end user experience, I don't want to say completely out of the learning content developer's hands, but it kind of frees them up to better frame the content for the lesson rather than worrying about the fit and finish of the product. AP: Yeah, and let's focus now on some of the options out there in the structured content world for learning and development content. There are several out there; let's talk about what's on the table. BS: It comes down to two different types of systems. One would be a learning content management system, an LCMS, a system that's built for learning content specifically. AP: Yeah, I would say it's purpose-built, I agree, yeah. BS: Yeah, and it functions the same way as a lot of, I guess what we would call the traditional techcomm component content management systems do, where you're able to develop in fragmented pieces, in a structured way, in a centralized manner, and intelligently reuse and remix all of these different components to produce some type of deliverable. AP: Right, so you can therefore, within this system, set up things for different locations, different audiences, whatever else. And if you move into an LCMS or one of the other solutions we're talking about, you are also going to make localization and translation much more efficient, and you'll get stuff turned around in other languages for other locales much more quickly. So we've got the LCMSs, which are more proprietary, and then, on the flip side of that, let's talk about DITA.
BS: So DITA does provide you with a decent starting point for developing your content, and we've helped several clients do this already. But on the flip side, where an LCMS is targeted at developing learning content, a lot of the tools for DITA aren't, so it requires a lot of customization of the tool chain, as well as of the content model, to get things up and running. However, DITA does give you an easier point of integration with any work that is being produced by your techcomm peers. AP: Yeah, I do think it's fair to say it's a little more extensible, but the mere fact that it is an open standard and extensible means that it may take some configuring to make it exactly what you need it to be. And like Bill was saying, DITA has some custom structure that is a very good fit; it is specifically for learning and training, and you can further customize those customizations to match what you need. I will say, I think some of the assessment structures are not as robust as they should be, and we've had to customize those for some clients. So that's another thing that you would have to think about when you're trying to make this decision: do I need to go with an LCMS, or do I want to go with DITA and a component content management system, and understand that I'm going to have to make some adjustments to make it more learning and development friendly? BS: No matter which way you slice it, though, moving to any kind of structured repository and structured system really starts to open things up from a back-end production point of view, while not necessarily forgoing a lot of the experience-driven design that goes into producing those different learning deliverables. It is a way to become more efficient, and as Alan mentioned, avoid the copy and paste, which can be a nightmare to maintain over time. AP: And at the same time, you do not have to throw out your standards for the quality of the content and the quality of the learning experience. You want structure to support, bolster, and maintain those things, so don't look at it as something that is going to degrade them, because when used correctly, it can really help you maintain that level of quality and consistency that you really need for an outstanding learning experience. And with that, Bill, I think we can wrap up. Thank you very much. BS: Thank you. Outro with ambient background music Christine Cuellar: Thank you for listening to Content Operations by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links.
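To make the reuse-and-profiling idea from this episode concrete, here is a minimal, hypothetical Python sketch of a single-source repository filtered per audience and platform. It is purely illustrative: in a real toolchain, DITA profiling attributes plus a DITAVAL filter (or an LCMS's publishing rules) do this work, and every name and value below is invented.

# Minimal, hypothetical sketch of single-source learning content with
# profiling metadata. Illustrative only; not a real DITA or LCMS toolchain.

from dataclasses import dataclass, field


@dataclass
class Component:
    text: str
    audience: set = field(default_factory=lambda: {"all"})
    platform: set = field(default_factory=lambda: {"all"})


# One shared repository of components: written once, never copy-pasted.
repository = [
    Component("Welcome to the equipment safety course."),
    Component("Instructor note: allow 10 minutes for discussion.",
              audience={"instructor"}),
    Component("Review the basics module before continuing.",
              audience={"beginner"}),
    Component("Select Next to continue.", platform={"elearning"}),
    Component("Turn to the next page in your guide.", platform={"print"}),
]


def build_deliverable(audience: str, platform: str) -> list:
    """Slice the shared repository into one audience/platform variant."""
    return [
        c.text
        for c in repository
        if c.audience & {audience, "all"} and c.platform & {platform, "all"}
    ]


# Same source, different outputs: an instructor guide for live delivery
# versus a beginner's e-learning module.
print(build_deliverable("instructor", "print"))
print(build_deliverable("beginner", "elearning"))

The takeaway is the mechanism, not the code: every variant is computed from one shared source, so a correction made once shows up in every deliverable instead of having to be chased through copies.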
The post Transform L&D experiences at scale with structured learning content appeared first on Scriptorium.
Creating content ops RFPs: Strategies for success (22:17)
In episode 179 of the Content Strategy Experts podcast, Sarah O'Keefe and Alan Pringle share the inside scoop on how to write an effective request for proposal (RFP) for content operations. They discuss how RFPs are constructed and evaluated, strategies for aligning your proposal with organizational goals, how to get buy-in from procurement and legal teams, and more. When it comes time to write the RFP, rely on your procurement team, your legal team, and so on. They have that expertise. They know that process. It's a matter of pairing what you know about your requirements and what you need with their processes to get the better result. — Alan Pringle Related links: Survive the descent: planning your content ops exit strategy (podcast) The business case for content operations (white paper) Content accounting: Calculating value of content in the enterprise (white paper) Building the business case for content operations (webinar) LinkedIn: Sarah O'Keefe Alan Pringle Transcript: Disclaimer: This is a machine-generated transcript with edits. Alan Pringle: Welcome to the Content Strategy Experts Podcast, brought to you by Scriptorium. Since 1997, Scriptorium has helped companies manage, structure, organize, and distribute content in an efficient way. In this episode, we talk about writing effective RFPs. A request for proposal (RFP) approach is common for enterprise software purchases, such as a component content management system, which can be expensive and perhaps risky. Hey everybody, I am Alan Pringle. Sarah O'Keefe: And I'm Sarah O'Keefe, hi. AP: So Sarah, we don't sell software at Scriptorium, so why are we talking about buying software? SO: Well, we're talking about you, the client, buying software, which is not always, but in many cases, the prerequisite before we get involved on the services side to configure and integrate and stand up the system that you have just purchased to get you up and running. And so, because many of our customers, many, most, nearly all of our customers are very, very large, many of those organizations do have processes in place for enterprise software purchases that typically either strongly recommend or require an RFP, a request for proposal. AP: Which, let's be very candid here, nobody likes. Nobody. SO: No, they're horrible. AP: Vendors don't like them. People who have to put them together don't like them. But they're a necessary evil, and there are things you can do to make that necessary evil work for you. And that's what we want to talk about today. So the first thing you need to do is some homework. And part of that homework, I think, is talking with a bunch of stakeholders for this project or this purchase and teasing out requirements. So let's start with that. And this is even before you get to the RFP itself; there's some stuff you need to do in the background, and let's talk about that a little bit right now. SO: Right, so I think, you know, what you're looking to get to before you go to RFP is a short list of viable candidates, probably in the two to three range. I would prefer two; your procurement people probably prefer three to four. So, okay, two to three. And in order to get to that list of these-look-like-viable-candidates, as Alan's saying, you have to do some homework. Step one: what are the hard requirements that IT or your sort of IT structure is going to impose? Does the software have to be on premises, or does it have to be software as a service?
Nearly always these days, organizations are hell-bent on one or the other, and it is not negotiable. Maybe you have a particular type of single sign-on and you have some requirements around that. Maybe you have a particular regulatory environment that requires a particular kind of software support. You can use those kinds of constraints to easily, relatively easily, rule out some of the systems that simply are not a fit for what your operating environment needs to look like. AP: And by doing that, you are going to reduce the amount of work in the RFP itself. So you're going to streamline things because you've already figured out, this candidate is not a good fit. So why bother them, and why make work for ourselves having to correspond with a vendor that ends up not being a good fit? SO: Right, and if we're involved in a process like this, which we typically do on the client side, so we engage with our customers to help them figure out how to organize an RFP process, right, we're going to be strongly encouraging you to narrow down the candidate list to something manageable, because the process of evaluating the candidates is actually quite time-consuming on the client side. And additionally, it's quite time-consuming for the candidates, the candidate software companies, to write RFP responses. So if you know for a fact that they're not a viable candidate, you know, just do everybody a favor and leave them out. It's not fair to make them do the work. AP: No, it's not. And we've seen this happen before, where an organization will keep a vendor in the process kind of as a straw man to strike down fairly quickly. It would be kinder and maybe more efficient to do that before you even get to the RFP response process, perhaps. SO: Yeah, and of course, again, the level of control that you have over this process may vary depending on where you work and what the procurement RFP process looks like. There are also some differences between public and private sector and some other things like that. But broadly, before you go to RFP, you want to get down to a couple of viable candidates, and that's who should get your request for proposal. AP: Yeah, and when it does come time to write that RFP, do rely on your procurement team, your legal team. They have that expertise. They know that process. It's a matter of pairing what you know about your requirements and what you need with that process to get the better result. And I think one of the key parts of this communication between you and your procurement team is about use case scenarios. So let's talk about those a little bit, because they're fundamental here. SO: Yeah, so your legal team, your procurement team is going to write a document that gives you all the guardrails around what the requirements are, and you have to be this kind of company, and our contract needs to look a certain way, and various things like that. We're going to set all of that aside because, A, we don't have that expertise, and B, you almost certainly as a content person don't have any control over that. You're just going to go along with what they are going to give you as the rules of the road in doing RFPs. However, somewhere inside that RFP it says, these are the criteria upon which we will evaluate the software that we are talking about here.
And I think a lot of our examples here are focused on component content management systems, but this could apply to other systems, whether it's translation management, terminology, metadata, you know, all these content-related systems that we're focused on. So, somewhere inside the RFP, it says, we need this translation management system to manage all of these languages, or we need this component content management system to work in these certain ways. And your goal as the content professional is to write scenarios that reflect your real-world requirements that are unique to your organization. So if you are in heavy industry, then almost certainly you have some concerns around parts, about referencing parts and part IDs, and maybe there's a parts database somewhere, and maybe there are 3D images, and you have some concerns around how to put all of that into your content. That is a use case that is unique to you, versus a software vendor who is going to have some sort of, we have 80 different variants of this one piece of software depending on which pieces and parts you license, and that's gonna change the screenshots and all sorts of things. So what you wanna do is write a small number of use cases. We're talking about maybe a dozen. And those dozen use cases should explain, you know, as a user inside the system, I need to do these kinds of things. You might give them some sample content and say, here is a typical procedure, and we have some weird requirements in our procedures, and this is what they are. Show us how that will work in your system. Show us how authoring works. Show us how I would inject a part number and link it over to the parts database. Show us, you know, those kinds of things. So, the use case scenarios typically should not be, "I need the ability to author in XML," right? AP: Or, "I need the ability to have file versioning," things that every CCMS on the planet does, basically. SO: Right, somewhere there's a really annoying and really long spreadsheet that has all those things in it, fine. But ultimately, that's table stakes, right? They should not get to the short list unless you've already had this conversation about file versioning and the right class of system. The question now becomes, how do you provide a template for authors, and what does it look like for authors to start from a template and do the authoring that they need to do? Is that a good match for how your authors need to or want to or like to work? So the key here, from my point of view, is don't worry too much about the legalese and the process around the RFP, but worry a whole bunch about these use case scenarios and how you are going to evaluate all the different tools that you're assessing against them. AP: Be sure you communicate those use case scenarios to your procurement team in a way they understand so they have a better handle on what you need, because the more everybody is on the same page as far as those use cases go, the clearer it's going to be to communicate those things to the candidate vendors when they do get their hands on the RFP. SO: And I think as we're going in or talking about going into a piece of software, there probably should already be some consideration around exit strategy, which, Alan, you've talked about a whole bunch. What does it mean to have an exit strategy and to evaluate that in the inbound RFP process?
AP: It is profoundly disturbing to have to think about leaving a tool before you've even bought it, but it does behoove you to do that, because you need a clear understanding of how you are going to transition out of a tool before you buy it. So when that happens, when you come to a point where you have to do it, you have an understanding of how you can technically exit that tool. For example, how can you export your source files for your content? What happens when you do that? In what formats? These are part of the use cases that you're talking about, perhaps, here too. So it really is so weird to have to think about something that's probably years down the road, but it is to your advantage to do that at this point in the game. SO: Yeah, I mean, what's the risk if something goes sideways or if your requirements change? This doesn't have to be sideways. So you are in company A and you buy tool A, which is a perfect fit for what you're doing. Company A merges with company B. Company B has a different tool, and B is bigger than A. So B wins, and you exit tool A as company A, and you need to move all your content into tool B. Well, that's a case where you made all the right decisions in terms of buying the software. You just didn't account for a change in circumstances, as in B swooped in and bought you. So what does it look like to exit out of tool A? AP: Yeah, it doesn't necessarily have to be the tool no longer works for us. It could be what you describe. There can be external factors that drive the need to exit that have nothing to do with bad fit or failure on anybody's part. SO: So we have these use case scenarios, and we've thought about exit, though this is entrance. AP: Or even before entrance, you haven't even entered yet. SO: And so now you're going to have a demo, right? The software vendor is going to come in, and they're going to show you all your use case scenarios. Well, we hope they're going to show you your use case scenarios. Sometimes they wander in and they show you a canned demo, and they don't address your use cases. That tells you that they are not paying attention. And that is something you should probably take into account as you do your evaluation. AP: Yeah, and on a similar note, don't get sucked in by flashy things, because that flash may blind you and very nicely disguise the fact that they can't quite match one of your use cases. Oh, look at this sparkly thing over here. Don't fall for that. Don't do it. Yeah. SO: Sparkles. So, okay, so we have our use cases, and they, the software vendor, are going to bring some sort of a demo person, and they are going to demo your use cases, and hopefully they're going to do it well. So you sort of check those boxes and you say, okay, great, it works. I think the next step after that is not to buy the tool. The next step after that is to ask for a sandbox so that your users can trial it themselves. There is a big, big difference between a sales engineer or a sales support person who has done hundreds, if not thousands, of demos going click, click, click, click, click, look at how awesome this is, and your brand new user who has never used a system like this, maybe, trying to do it themselves. So user acceptance testing: get them into a not-for-production sandbox, let them try out some stuff, let them try out all of your use cases that you've specified, right? AP: It's try before you buy is what we're talking about here. Yep. SO: Mm-hmm.
Yeah, I've just made a whole bunch of not-friends among the software vendors, because of course setting up sandboxes is kind of a pain. AP: It's not trivial. SO: Yeah, but you're talking to just one or two candidates, right? So it is not unreasonable. It is completely unreasonable if you just did a, you know, spray-this-thing-far-and-wide and asked a dozen software vendors for input. That is not okay from my perspective. And when we're involved in these things, we try very, very hard to get the candidate list down to, again, two or three at most, because almost certainly you have requirements in there somewhere that will make one or another of the software tools a better fit for you. So we should be able to get it down to the reasonable prospect list. AP: And I think, too, this goes back to efficiency. Having fewer people or fewer companies in this means you're gonna have to spend less time per candidate system, because you've already narrowed it down to organizations that are gonna be a better fit for you. So it's gonna be more efficient for them, because they're not having to do as much show and tell, because you've narrowed things down very specifically with your use cases. Also, for you as the tool buyer and your procurement team, you're going to have less to do because you're not having to talk to four to six candidates, which you should not be doing for an RFP, in my opinion. I know some people in procurement will probably disagree with that, though. SO: Well, we're just going to make everybody mad today. And while I'm on the topic of not making friends and not influencing people, I wanted to mention something that probably many of you as listeners are familiar with, which is something called the Enterprise Architecture Board. If you work in a company of a certain size, you probably have an EAB. And the EAB is kind of like the homeowners association of your company, right? They are responsible for standards and making sure that you occasionally mow the lawn, and whatever other ridiculous rules the homeowners association sets. But EABs, Enterprise Architecture Boards, in a company context are responsible for software purchases, software architecture, and looking at what kinds of systems we are bringing into this organization and, usually, how can we minimize that? How can we maintain a reasonable level of consistency instead of bringing in specialty solutions all over the place? Now, a CCMS, a component content management system, is pretty much the definition of a specialty system. AP: It's niche. Yeah. SO: Yep, and EABs in general will take one look at it and say something very much like, "CCMS, no, we have a CMS. We have a content management system. We have SharePoint, just use that. We have Sitecore, just use that. We have fill-in-the-blank, just use that." And your job, if you have the misfortune to have to address an EAB, is to explain why the approved existing solutions within the company architecture do not meet the requirements of the thing that you are trying to do, and why it is worth the effort and the risk and the complexity of bringing in another solution beyond the baseline CMS that they've already approved to solve the issues that you've identified for your content. The first part is not hard; the "and" part is the hard part, and they're going to talk about TCO, total cost of ownership. This is difficult. I've spent a lot of quality time with EABs, and literally, their job is to say no.
I mean, that is just flat out their job. Their job is to streamline and minimize and have as few solutions as possible. So if you have to deal with this kind of situation, you're going to have some real challenges internally getting this thing sold. AP: Yeah, and while we're making friends and influencing people with our various comments on this process today, one final thing I want to say before we wrap up is that common courtesy goes a really long way in this process. When you have wrapped things up and you have made your selection, be sure you also communicate that to the vendors you did not choose. SO: Yeah. AP: Too many times in RFP processes, there's not that level of communication with the people who did not win. And it's just common courtesy: let them know, no, we chose someone else. And if you're feeling super polite, you might even tell them why: this use case you didn't quite hit, this is why we went with this other organization, if you choose to. So be nice and be courteous, because I realize this is more of a professional business situation, but it still doesn't hurt to tell someone exactly why you did what you did. SO: Yeah, and I know those of you more on the government side of things, or nonprofit, typically do have a requirement to notify on RFPs and even give reasons and all the rest of it. But on the corporate side, there's typically not any sort of requirement to let people know, as Alan said. You know, people put a lot of work into these RFPs, and a lot of pain. AP: Yeah. SO: And one last, last thing beyond you-should-notify-people: I want to talk about RFP timing. So we're rolling into the end of 2024 here. I fully expect that there will be RFPs that will come out on roughly December 15th, which will be due on something like January 1st. So in other words, "Hi vendors, please feel free to spend your holiday time filling out our RFP so that we can, you know, go into the new year with shiny RFP submissions." AP: RUDE! SO: That is not polite. Don't do that. It is extremely rude. And it signals a level of disrespect that, from the vendor side of the process, makes them perhaps less inclined to bend on some other things. So allow a reasonable amount of time for the scope of work that you're asking for. And holidays don't count. AP: Yeah, exactly. I think we can kind of wrap this up and go back to what we were talking about earlier. All of that legwork that you do upfront for this RFP process, your vendors, believe it or not, will generally appreciate it, because it shows you've done the homework, you have thought about this, and you're not just wildly flinging out asks with no money, no stakes behind those asks. And they will probably be much more willing to work with you and go that extra mile when you have done that homework. Is there anybody else that we need to tick off before we wrap up? SO: I think we covered our list. So I'll be interested to see what people think of this one. Let us know, maybe politely, but let us know. AP: And I'll wrap up before any violence occurs. So thank you for listening to the Content Strategy Experts Podcast, brought to you by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links. The post Creating content ops RFPs: Strategies for success appeared first on Scriptorium.
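One practical addendum to the RFP episode above: the hosts don't prescribe any particular evaluation mechanics, but a simple weighted scorecard is one way to compare your two or three shortlisted vendors against your use case scenarios after demos and sandbox testing. The Python sketch below is purely illustrative; the scenario names, weights, and scores are all invented.

# Illustrative scorecard for comparing shortlisted vendors against
# weighted use case scenarios. Replace these invented names, weights,
# and scores with your own requirements and demo/sandbox results.

# Weight reflects how important each scenario is to your organization.
scenarios = {
    "Author a procedure from a template": 3,
    "Link part numbers to the parts database": 5,
    "Export source content in open formats (exit strategy)": 4,
}

# Scores (0-5) per vendor, gathered during demos and user acceptance
# testing in a not-for-production sandbox.
scores = {
    "Vendor A": {
        "Author a procedure from a template": 4,
        "Link part numbers to the parts database": 2,
        "Export source content in open formats (exit strategy)": 5,
    },
    "Vendor B": {
        "Author a procedure from a template": 3,
        "Link part numbers to the parts database": 5,
        "Export source content in open formats (exit strategy)": 3,
    },
}

best_possible = 5 * sum(scenarios.values())
for vendor, results in scores.items():
    total = sum(weight * results[name] for name, weight in scenarios.items())
    print(f"{vendor}: {total}/{best_possible}")

Note that the exit scenario is weighted and scored like any other requirement, which echoes the episode's point about evaluating your way out of a tool before you buy your way in.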
Pulse check on AI: December, 2024 (podcast) (19:41)
In episode 178 of the Content Strategy Experts podcast, Sarah O'Keefe and Christine Cuellar perform a pulse check on the state of AI as of December 2024. They discuss unresolved complex content problems and share key considerations for entering 2025 and beyond. The truth that we're finding our way towards appears to be that you can use AI as a tool and it is very, very good at patterns and synthesis and condensing content. And it is very, very bad at creating useful, accurate, net new content. That appears to be the bottom line as we exit 2024. — Sarah O'Keefe Related links: Pulse check on AI: May, 2024 (podcast) AI in the content lifecycle (white paper) The future of AI: structured content is key (webinar) Savor the season with Scriptorium: Our favorite holiday recipes LinkedIn: Sarah O'Keefe Christine Cuellar Transcript: Disclaimer: This is a machine-generated transcript with edits. Christine Cuellar: Welcome to the Content Strategy Experts Podcast, brought to you by Scriptorium. Since 1997, Scriptorium has helped companies manage, structure, organize, and distribute content in an efficient way. In this episode, it's time for another pulse check on AI. Our last check-in was in May, which in AI terms is ancient history, so today, Sarah O'Keefe and I are gonna be talking about what's changed and how it can affect your content operations. Sarah, welcome to the show. Sarah O'Keefe: Hey Christine, thanks. CC: Yeah. So as we're currently recording this, 2024 is winding down and people are preparing for 2025. Throughout this year, we went to a lot of different conferences and events, and of course, everybody's talking about AI. So Sarah, based on the events that you just recently got back from (you finally get to be in your own house), what are your thoughts about what's going on with AI in the industry right now? SO: It's still a huge topic of conversation. Lots of people are talking about AI; a huge percentage of presentations, you know, had AI in the title or referenced it or talked about it. With that said, it seems like we're seeing a little more sort of real-world, hey, here's some things we tried, here's what's working, here's what's not working. CC: Mm-hmm. SO: And I'll also say that we're starting to see a really big split between AI in regulatory environments, which would include the entire EU plus certain kinds of industries, and the sort of wild, wild west of we-can-do-anything. CC: Yeah. So it sounds like, you know, when AI first came onto the scene, it was mostly, let's just all adopt this right now, let's go for it full steam ahead, especially marketers; as a marketer, I can say that because we're definitely gung-ho about stuff like that. It sounds like the perspective has shifted to being more balanced overall. Is that what you would say? SO: Yeah, I mean, that's the typical technology adoption curve, right? You know, you have your peak of inflated expectations, and then you have the, I think it's the valley, it's not the valley of despair, but it's something like that. But you know, you sort of go from, this can do anything, this thing is so cool, go, go, go, go, go, to a more realistic, okay, what can it actually do? And, you know, this is true for AI or anything else: what can it do? What can't it do? What does it do well? CC: Mm. SO: Where do we need to put some guardrails around it? What are some surprises in terms of things that are and are not working? CC: Yeah.
And at some of the conferences we were at this year, our team had some things to say about AI as well, so we will link some of the recap blog posts we have in the show notes. Sarah, what are some of the things AI can't do right now? What are some of the big concerns about AI that are still unanswered, unresolved? SO: So in the big picture, as we're starting to see people roll out AI-based things in the real world, whether it's tool sets or content ops or anything else, we're starting to see some really interesting developments and some really interesting assessments. Number one is when you look at those little AI snippets that you get now when you do a search and it returns a bunch of search results, well, actually it returns a page of ads. CC: Yes. SO: And then some real results under the ads. And then above that, it returns an AI overview snippet. So those are surprisingly bad. You do a search on something that you know a little bit of something about and see what you get. And you will see content in there that is just flat wrong. I'm not saying it's not the best summary. I'm saying it is factually incorrect, right? CC: Yeah, I hate them right now. SO: So those are surprisingly bad. And talking about search for a minute, which ties into your question about marketing, there are some real problems now with SEO, with search engine optimization, because if I'm optimizing my content to be included in an AI overview that is, A, wrong, and B, doesn't actually give me credit, what's the point? Pre-AI, those snippets that showed up would say, I sourced it from over here. CC: Mm-hmm. SO: And in many cases now, the AI overview is just the sort of summary paragraph with no particular, there's no citation. It doesn't say where it came from. So what's in it for me as a content creator? Why am I creating content that's going to get taken over by the AI overview and then not lead to people going to my webpage, right? How's that helping me? CC: Yeah. Yeah. SO: So there are some real issues there, and there's a move in the direction of thinking about levels of information. So thinking about very superficial information. How much does a cup of flour weigh? That type of thing. That's just a fact, and you can get it pretty much anywhere, we hope. And then there's deeper information. Why is it better to weigh flour than to measure it by volume, if you're a baker? CC: Yeah. SO: And what does it look like to use weights? And are there differences among different kinds of flours? And what are some of the things I should consider when I'm going in that direction? So one of those, you know, a cup of flour weighs 120, sorry, a cup of all-purpose flour weighs 120 grams, is a useful fact. And I don't know if I really care if people peruse that further or come to my website for more about flour. The deeper information, the more detailed discussion of, you know, whole wheat versus all-purpose versus European flours versus American flours and all these other kinds of things, that requires more in-depth information, and that is not so subject to being condensed into an AI summary. So there's that distinction between, you know, quick and dirty information versus deeper information, information that goes into a topic in depth. CC: Mm-hmm. SO: We have a huge problem with disinformation and misinformation, with information that is just flat out not correct, because with the way AI tools work, it is trivially easy to generate content at scale. Tons and tons and tons and tons and tons of content.
And because it's trivially easy, CC: Mm-hmm. SO: That means it's also trivially easy for me to generate, for example, a couple thousand fake reviews for my new product, or a couple thousand websites for my fake products. We can fractionalize down the generation of content. CC: Yeah. SO: And, you know, the interesting part of this is that it implies that you could potentially, you know, we talk about doing A/B testing in marketing, you could do A/B/C/D/E/F/G testing pretty easily, because you can generate lots and lots of variants and kind of throw a bunch of stuff against the wall and see what works. But the bad side of this is that you can generate fake news, fake information, fake content that is going to be highly, highly problematic from a content consumer trust point of view. And so that, I think, is the third piece that we're looking at now that is going to be critical going forward. And that is information trust, content reputation, or the reputation of content creators, and credibility. CC: Mm-hmm. SO: So for those of you listening to this podcast, how do you know it's really us? Do you know these are live humans actually recording this podcast? Versus, you know, there's now the ability to generate synthetic audio, and you can create a perfectly plausible podcast, which is really hard to say unless you're probably an AI, and then it can probably do it perfectly. But our perfectly plausible podcasts aside, how do you know that what you're receiving in terms of content, digital content in particular, is actually trustworthy? And so I think ultimately there's going to be some, there needs to be some, tooling around verification, around authenticity, around, you know, this was not edited. You know, in the same way that you want to be able to verify that a photo, for example, is an accurate record of what happened when that photo was taken. CC: Yeah. SO: And if I went in and photoshopped it and cleaned it up, then that's something that should be acknowledged. By the way, for the record, we do record these things and we do edit them. We try to stay on the right side of just editing out dumb mistakes and not editing in a misleading way. CC: Yeah, ums and ahs, and yeah. SO: So it's not like we record the whole thing from soup to nuts and never, you know, never break in and never edit things out, because believe me, I've said some stuff that needed to be taken away. If you ever get the raw files, they are full of, I didn't mean to say that, you might want to take that out. CC: Me too, so many times. Let me start over, that's me a lot, all the time. SO: Yeah, sorry. Starting over. OK, but the point is that when we put out a podcast, we are saying, this is our opinion, this is our content, and we're gonna stand behind it. Whereas if it's synthetic or AI-generated, where these non-humans can do these weird let's-make-a-podcast-out-of-a-blog-post things, well, okay, but what's the value of that, and why would I trust that content? CC: Yeah. SO: So that, I think, is going to be the big question for the next couple of years: what does it look like to be a content creator in an AI universe, and, as the content consumer, to have the ability to validate what you're listening to or reading or seeing? CC: Yeah. And a point that you had brought up in, I believe it was the white paper that you authored back in 2023.
One of the points in there was that, because of this trust and credibility issue, people are going to have to start relying on companies and brands that they're already familiar with for the information that they're looking for, rather than a search from scratch, because, you know, search is so messed up right now. And that is something I've seen personally. Myself, I do it a lot more, and I've seen that with friends and other contacts and stuff like that. That's really what people are doing: they're going to, you know, the source, even for recipes. Recently, I was looking for a recipe, and instead of just Googling it like I used to, because I'm so sick of the summarized AI search, I went to Allrecipes, you know, a place where I knew that I liked the recipes, or I think Sally's Baking Addiction or something like that. There are a lot of different places like that where now I'll just go there instead of, you know, a search from scratch. That's… I don't know how we're gonna fix that problem, yeah, trust and credibility, that's gonna be a huge one. SO: It's a really good example, though, because if you searched for a particular recipe, even say two years ago, you would get a certain set of results and then you would say, I've heard of that website and I'll go there. Now you search on a recipe, and I'm getting 20, 30, or 40 websites that I've never heard of that all seem to have posted exactly the same recipe. CC: Mm-hmm. SO: You know, do I trust them? Do I trust them not to be AI-generated? Do I trust them to remember to not, you know, recommend that I put gravel in my recipes? You know, maybe not. And so I'm doing the same thing you are, which is, you know, reverting to trusted sources, trusted brands that I know have a reputation for producing good recipes. Now, the flip side of this is that content is disappearing. CC: Hmm. SO: So, I have an infamous triple chocolate cookie recipe, which is really, if you're looking for a chocolate bar in the form factor of a cookie, that is what it is. It's just stupid amounts of chocolate. CC: Mm-hmm. Yes, that sounds amazing. SO: They're delicious. And I think we're putting them in our holiday post, which may or may not have gone live already. So keep an eye out for that. But here's the thing. I have the recipe because I got it out of Food & Wine about 20 years ago, and I have a paper cutout of it that I hand-wrote "Food & Wine 12/01" on. So it was December of 2001, and so I went to Food & Wine. I went searching for this recipe, knowing that it was originally published by them. I can't find it. It is not there. CC: Hmm. Wow. SO: It is not in their database, or at least it didn't come up in their database when I searched on the exact name of the recipe. I then searched that exact recipe name, you know, just generally on the Internet, and I found three or four or five different places that had it, but none of them credited where I got it from 20 years ago, which I'm pretty sure is the original, right? Because these are all much more recent sites. So there are digital copies out there floating around, but they are not the original recipe, and they didn't credit the original publisher. Now, I don't know exactly where Food & Wine got it, because all I did was cut out the recipe. I didn't cut out the article, which probably had the context around it. But what I'm now reduced to is that I have a paper copy stashed in my paper recipe book, right? And I took a photo of the paper copy and put it on my phone.
So I have a sort of digital version, but it is literally a photograph of a printout. It is 2024 and we are doing photographs of printouts, but I can’t find the original online. CC: Yes. Yeah. That’s interesting. Why do you think that content has disappeared? Do you think it’s because of the breakdown of the content model where the AI engine is just eating what it’s already regurgitated a bunch of times? Do you think it’s that? Did the org pull it for some reason, or what do you think is the cause? SO: Well, I mean, my best guess is that their recipe database only goes back so far and they just said anything more than X years old doesn’t need to be in here. They had some similar recipes. So maybe, well, this one’s been updated. It’s a little more modern, whatever. But it was really troubling that, even knowing what the source was, I couldn’t find it. CC: Yeah, that is troubling. So how can companies prepare knowing that this is our context, this is our landscape? What should we do to prepare for 2025 and beyond? Because it’s not just like next year. SO: Beyond, yeah, okay. So first of all, you have to understand your regulatory environment, because that is very different by country or by region: the issues that people in the EU are looking at, or American companies that sell in the EU, right? CC: Mm-hmm. Yeah. SO: There’s an EU AI Act, and there’s a whole bunch of guidance that goes along with that. So there are some concerns there. Whereas here in the US specifically, we don’t have a lot of regulation around AI, if any. Mostly we lean on, well, if you put out something that’s incorrect, there’s potentially product liability. If you put out instructions that are wrong and people follow them and they get hurt or worse, then the product owner is probably liable for putting out wrong instructions. That’s kind of where our stuff lands. But as a content consumer, I think you have to do what you’re describing, Christine, and become very, very skeptical about your sources and methods, right? Where’d you get this stuff? And do you trust the source that it came from? CC: Yes. SO: If you are a content creator, then looking at questions around AI, the questions become, how can I employ AI inside my content workflows in a responsible way that achieves the goals that I have and doesn’t get me in big trouble in whatever way? And there’s also the question of, if I’m a content creator and I know that my consumers, my customers, are going to be using AI to consume my content, then how do I optimize for that? How do I prepare for that? So it looks very different if you’re a person writing, creating new content, versus you’re the person deploying a chatbot on your corporate website that’s going to go read through your content corpus, versus the person actually using the chatbot, versus you name it. So. CC: Yeah. SO: And then, you know, we’re talking about AI generally, but of course we have AI tooling and we also have generative AI and we have all sorts of different things going on. So it’s a very, very broad topic, but overall, you know, what’s the problem I’m trying to solve? Can I apply this tool in a useful way? And what are some of the guardrails that I need to employ to keep myself out of trouble? CC: Yeah, in one of our webinars from this year, from 2024, depending on when you’re listening to this podcast, Carrie Hane mentioned something along the lines of, you know, when you’re dealing with AI, it’s such a huge topic.
You need to break it down by what’s the purpose of what you’re trying to do and then tackle the problem that way. Okay. So to wrap up, Sarah, what are your final thoughts, wishes, and/or recommendations for the world as we enter this new era? Or I guess we’re in it, but as we try to recover. SO: So, the very short version, and we’ll try and keep it short: I think when all this AI stuff hit us a year or two ago, business leaders generally were hoping that they could just use AI as a general-purpose solution. Fire all the people, use AI for all the things, cool. CC: Mm-hmm. SO: The truth that we’re grasping towards or finding our way towards appears to be that you can use AI as a tool, and it is very, very good at patterns and synthesis and condensing content. And it is very, very bad at creating useful, accurate, net new content. That appears to be the bottom line as we exit 2024. CC: Yeah. Well, thank you very much for unpacking this with us because I know that, you know, things are changing so fast. It’s helpful to have people like you who have been in the industry, the content industry specifically, for a really long time and who can help, you know, figure out a way through all this and give some practical ideas. SO: Well, you know, in six months, we’ll just feed this podcast into the AI and tell it to fix it so that it remains accurate. And off we go. CC: Yeah, there we go. And then we’re done. SO: And we’re done. CC: Yeah. Thanks so much for being here today and for talking about this. SO: Yeah, anytime. CC: And thank you for listening to the Content Strategy Experts Podcast brought to you by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links. The post Pulse check on AI: December, 2024 (podcast) appeared first on Scriptorium.
Do enterprise content operations exist? 24:34
Is it really possible to configure enterprise content (technical, support, learning & training, marketing, and more) to create a seamless experience for your end users? In episode 177 of the Content Strategy Experts podcast, Sarah O’Keefe and Bill Swallow discuss the reality of enterprise content operations: do they truly exist in the current content landscape? What obstacles hold the industry back? How can organizations move forward? Sarah: You’ve got to get your terminology and your taxonomy in alignment. Most of the industry, I am confident in saying, have gone with option D, which is give up. “We have silos. Our silos are great. We’re going to be in our silos, and I don’t like those people over in learning content anyway. I don’t like those people in techcomm anyway. They’re weird. They’re focused on the wrong things,” says everybody, and so they’re just not doing it. I think that does a great disservice to the end users, but that’s the reality of where most people are right now. Bill: Right, because the end user is left holding the bag trying to find information using terminology from one set of content and not finding it in another and just having a completely different experience. Related links: The business case for content operations (white paper) Replatforming an early DITA implementation (case study) Hear Sarah speak about The reality of enterprise customer content at tcworld 2024! Hear Bill speak about The challenges of replatforming, also at tcworld 2024. LinkedIn: Sarah O’Keefe Bill Swallow Transcript: Disclaimer: This is a machine-generated transcript with edits. Bill Swallow: Welcome to The Content Strategy Experts podcast brought to you by Scriptorium. Since 1997, Scriptorium has helped companies manage, structure, organize, and distribute content in an efficient way. In this episode, we talk about enterprise content operations. Does it actually exist? And if so, what does it look like? And if not, how can we get there? Hi, everyone. I’m Bill Swallow. Sarah O’Keefe: And I’m Sarah O’Keefe. BS: And Sarah, they let us do another podcast together. SO: Mistakes were made. BS: So today we’re talking a little bit about enterprise content operations. If it exists, what it looks like. If it doesn’t, why doesn’t it exist? What can people do to get there? SO: So enterprise content ops, I guess first we have to define our terms a little bit. Content operations, content ops, is the system that you use to manage your content. And by manage I mean not the software, but how do you develop it, how do you author it, how do you control it, how do you deliver it, how do you retire it, all that stuff. So content ops is the overarching system that manages your content lifecycle. And when we look at content ops from that perspective, and of course we’re generally focused on technical content, but when we talk enterprise content ops, it’s customer-facing content, which includes techcomm, but also learning content, support content, product data potentially, and some other things like that. And ultimately, when I look at this, again bringing the lens back or going back to the 10,000-foot view, we have some enterprise solutions but only on the delivery side. The authoring side of this is basically a wasteland. So I have the capability of creating technical content, learning content, support content, and putting them all into what appears to be some sort of a unified delivery system.
But what I don’t really have is the ability to manage them on the back end in a unified way, and that’s what I want to talk about today. BS: So those who are delivering in that fashion, so being able to provide customer-facing information in a unified way, as far as their system for content ops goes, it’s more, I would say, human-based. So it’s a lot of workflow. It’s a lot of actual management of content and management of content processes outside of a unified system. SO: So almost certainly they don’t have a unified system for all the content, and we’ll talk about why that is I think in a minute. It’s not necessarily human-based, it’s more that it’s fragmented. So the techcomm group has their system, and the learning group has their system, and the support team has their system, et cetera. And then what we’re doing is we’re saying, okay, well once you’ve authored all this stuff in your snowflake system, then we’ll bring it over to the delivery side where we have some sort of a portal, website portal, content delivery platform (CDP) that puts it all together and makes it appear to the end user that those things are all in some sort of a unified presentation. But they’re not coming from the same place, and that causes some problems on the back end. BS: Right, and ultimately the user of that content doesn’t really care if it’s a unified presentation. They just want their stuff. They don’t want to have a disjointed experience, and they want to be able to find what they’re looking for regardless of what type of content it is. SO: Right, and the cliche is “don’t ship your org chart,” which is 100% what we’re doing. And so let’s talk a little bit about what does that mean, what are the prerequisites? So in order to have something that appears to me as the content consumer to be unified, well for starters, you mentioned search. I have to have search that performs across all the different content types and returns the relevant information. And what that usually means is that I have to have unified terminology. I’m using the same words for the same things in all the different systems. And I need a unified taxonomy, a classification system, metadata, so that when I do a search, and maybe I’m categorizing or I’m classifying things down and filtering, that filtering works the same way across all the content that I’ve put into the magic portal. So taxonomy and terminology are the things that’ll make your search, relatively speaking, perform better. So we have this on the delivery side and that’s okay-ish, or it can be, but then let’s look at what we’re doing on the authoring side of things because that’s where these problems start. BS: So what do they start looking like? SO: Well, maybe let’s focus in on techcomm and learning content specifically. We’ll just take those two because if I try and talk about all of them, we’re going to be here for days and nobody wants that. All right, so I have technical content, user guides, online help, quick snippets, how-tos. And I have learning, training content, e-learning, which is enabling content: I’m going to try and teach you how to do the thing in the system so that you can get your job done. Now, let’s go all the way back to the world where we have an instructional designer or a learning content developer and a technical content developer. So for starters, almost always those are two different people, just right off the bat.
And instructional designers tend to be more concerned with the learning experience, how am I going to deliver learning and performance support to the learner? And the technical writers, technical content people, tend to be more interested in how do I cover the universe of what’s in this tool set, or this product, and cover all the possible reasonable tasks that you might need to perform, the reference information you need, the concepts that you need? It’s a lot of the same information. There’s just a slightly different lens on it. And in the big picture, we should be able to take a procedure out of the technical content, step one, step two, step three, step four, and pretty much use that in a learning context. In a learning context, it’s going to be, hey, when you arrive for your job at the bank every morning you need to do things with cash that I don’t understand. And here’s a procedure, and this is what you’re going to do, steps 1, 2, 3, 4, 5, and you need to do them this way and you need to write them down, and it tends to be a little more policy and governance focused, but broadly it’s the same procedure. So there should be the opportunity to reuse that content. And big picture, a high-level estimate is probably something like 50% content overlap. So 50% of the learning content can or should be sourced from the technical content. The technical content is probably a superset in the sense that the technical content covers, or should cover, all the things you can do, and training covers the most common things or the most important things that you need to do. It probably doesn’t cover a hundred percent of your use cases. Okay, so now let’s talk about tools. BS: Right, because I was going to say these two people, the technical writer and the training developer, they are using, at least historically, two very different sets of tools to get their job done. SO: Right. So unified content solutions, without getting into too many of the specifics, which will get me in big trouble, basically the vendors are working on it, but they’re not there yet. There are a lot of point solutions. There’s a lot of, oh yes, we have a solution for techcomm and we have a solution for learning and we have a delivery solution, but there’s not a unified back end where you can do all this work. And some of the vendors have some of these tools in their stable, some of them don’t. But from my point of view, it doesn’t really make a whole lot of difference whether you buy two point solutions from separate vendors or from the same vendor because right now they’re disconnected. BS: They’re two point solutions. SO: Yeah, they’re all point solutions. So it’s not good. And then that brings us to how can we unify this today? What can we do, and what kind of solutions are our customers building or are we building with our customers? So a couple of things here. Option A is you take your structured content solution and you say, “Okay, learning content people, we’re going to put you in structured content. We’re going to move you into the component content management system. We’re going to topicify all your content, and we’re basically going to align you with the techcomm toolset and make that work.” We have a few customers doing that. It works well for learning content developers that are willing to prioritize the document structure and process over the flexibility in the downstream learning experience. BS: Right. SO: That’s a small set of people.
Most learning content developers are not willing to prioritize efficiency and structure over delivery, which I think is actually the root cause. BS: Right. Now, those who are doing this, they are seeing some benefit in being able to produce a wide variety of their training deliverables from that unified source. But again, it comes back to how willing people are to give up the flexibility that they have in developing course content. SO: We can talk about big picture and we can talk about all the things, but this decision, this approach, 100% of the time comes down to how badly do you want to be able to flail around in PowerPoint. And if having the ability to put random things in random places on random slides is critical, then this solution will not work. BS: So on the flip side, you would then look to maybe somehow connect your technical communication system to your learning repository. SO: Right. So you take your techcomm content and you treat it as a data source essentially for your learning content, and you just flow it into the learning authoring environment. It turns out that’s hard. BS: It’s very hard. SO: Super difficult. It’s difficult to get your structured content out into a format that the learning content system can accept in a reasonable manner. BS: And if your content is highly structured, you’re likely losing a lot of semantic data along the way to get it there. SO: Yeah, you lose a lot, and it’s just bad. And ultimately, this almost always lands, I mean we talk about flowing it in there, but ultimately this almost always means that you’re going to be copying and pasting and reformatting and re-reformatting, and it’s just terrible. BS: So more often than not, we’re not seeing this level of unification then. SO: Yeah, I mean, are you connecting your techcomm and your learning in a structured environment? A few people, yes. And for the right use case, it’s great. Or flow the techcomm content down into the learning environment, but ultimately not worth it, we’ll just copy and paste. So in terms of unification, basically none of the above, right? BS: Mm-hmm. So how would people get there? SO: So there’s a couple of options. The probably most common one is some sort of a DIY solution. We’re going to find a way to glue these systems together. We’re going to find a workflow that involves converting the techcomm content, which usually is created first, and moving it into the learning content. Again, for the right group, for the right environment, unifying everything in a structured authoring environment makes a lot of sense. I think ultimately that’s where it’s going to land, but the structured content systems need to do some work to make themselves into what amounts to a reasonable, viable authoring solution for the learning content people. Basically the learning content people are not willing to put up with the shenanigans that ensue in order to use a structured content system. And I’m not even sure they’re wrong, right? BS: Yeah. SO: They’re just saying, “No, this is terrible and we’re not doing it.” Okay, well, that’s fair. So either you tinker and put it all together in some way. Option B is wait for the vendors, wait for the vendors to address this problem, meet this requirement, and deliver some systems that have a solution here. And it’ll be a year or two or five or 20, and eventually they’ll get to it. You can go with a delivery-only solution, so we’re only going to solve this on the delivery side.
If you do that, you really, really, really, really need an enterprise-level taxonomy and terminology project group. BS: Absolutely. SO: You’ve got to get that aligned. You cannot go around having half your text say entryway and half your text say hallway, half your text say study and half your text say den. And I’m halfway down a Clue reference, was it the wrench or the outlet? No, no, no, okay. You have to get your terminology in alignment. You must, because otherwise people search on oven and it doesn’t return range because those are in fact… Well, okay, they’re not exactly the same thing, but close enough, so those types of things. So you’ve got to get your terminology and your taxonomy in alignment. Most of the industry, like most of the people out there that are doing techcomm and learning content, I am confident in saying have gone with option D, which is give up. Just don’t do it. Just don’t bother. We have silos. Our silos are great. We’re going to be in our silos, and I don’t like those people over in learning content anyway. I don’t like those people in techcomm anyway. They’re weird. They’re focused on the wrong things, says everybody, and so they’re just not doing it. I think that does a great disservice to the end users, but that’s the reality of where most people are right now. BS: Right, because the end user is left holding the bag there trying to be able to find information using terminology from one set of content and not finding it in another and just having a completely different experience. SO: They make it a you problem. BS: Yeah. So if you’re seeing opportunities to unify content operations in your organization, what are some key ways of communicating that up so that you can begin to get some funding, some support, some executive-level buy-in to do these things? SO: The technology problem is hard. Putting everybody in an actual unified authoring environment is a really hard problem. So I think what you want to do is go for the easier solutions where you can get some wins. And the easier solutions where you can get some wins are consistent terminology across the enterprise. So we’re going to have some conversations about terminology and what we need to do in terms of terminology, and everybody’s going to agree on the words we’re going to use. Taxonomy, what does our classification system look like? What are the names for our products, and how do we label things so that when we deliver all these different content chunks coming from all these different systems, we can bring them into alignment? I mean, you can do the work on the back end to align taxonomy or you can do it on the delivery side to say these things are synonyms. So there are some ways of addressing this even when you get down into the delivery end of things. But I think what you want to do is start thinking about the things… Oh, and translation management, which ties into both terminology and taxonomy. I think you want to start maybe with those things and then slowly work your way upstream, like a salmon, avoiding the bears on the… Okay, you’re going to try and work your way upstream towards the authoring. Because ultimately if you look at this from an efficiency point of view, it would be much, much more efficient to have unified authoring and put it all together. It’s just that right now today, that’s a heavy lift and it only makes sense in certain environments.
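As a concrete illustration of the delivery-side synonym alignment Sarah describes, here is a minimal sketch in Python. It assumes a simple keyword index; the synonym map and function names are illustrative, not taken from any particular delivery platform.

```python
# A minimal sketch of delivery-side terminology alignment: collapse
# synonymous terms to one canonical form before indexing and before
# querying, so a search on "oven" also surfaces content tagged "range".

# Canonical term -> accepted variants (maintained by the enterprise
# taxonomy and terminology group; these entries are illustrative).
SYNONYMS = {
    "range": ["oven", "stove", "cooktop"],
    "hallway": ["entryway", "foyer"],
}

# Invert to variant -> canonical for fast lookup.
CANONICAL = {v: canon for canon, variants in SYNONYMS.items() for v in variants}

def normalize(term: str) -> str:
    """Collapse a term to its canonical form, if one is defined."""
    term = term.lower()
    return CANONICAL.get(term, term)

def index_terms(doc_terms: list[str]) -> set[str]:
    """Normalize a document's metadata terms at indexing time."""
    return {normalize(t) for t in doc_terms}

def matches(query: str, indexed: set[str]) -> bool:
    """Normalize the query the same way, so variants align."""
    return normalize(query) in indexed

# Techcomm content tagged "range" and learning content tagged "oven"
# still find each other once both sides are normalized.
techcomm = index_terms(["range", "installation"])
learning = index_terms(["oven", "safety"])
assert matches("oven", techcomm)
assert matches("range", learning)
```

Whether this normalization runs on the back end at authoring time or in the delivery layer at query time is exactly the choice Sarah outlines; either way, the synonym list itself is the asset the taxonomy and terminology group maintains.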
So what can we do to prepare for that so that when we do get to that point and those tools do start to unify a little bit better, we’ve done the legwork that’ll make it easier to make that transition as we go? BS: Right. So it’s spending the effort to unify as much as you can the content and the language and the organization, as well as trying to keep pace with where I guess all of these different industry tools are going and making sure that you are making improvements in the right direction. So if you’re thinking about structured content, that you are keeping an open mind as to where and how I guess these other groups can start leveraging what you’re using and vice versa. And I guess talking with the other groups in your organization. So if you’re in techcomm, then talk to the training group, see what they’re doing, see what their plan is, what’s their five-year roadmap? Are they looking at certain technologies? How might that play into your development, and vice versa, being able to share that information. SO: And I know, Bill, you’re doing a session on re-platforming at tcworld this November 2024. And when you’re thinking about re-platforming, what are some of the factors that you should be looking at there that tie into this? BS: Well, it directly plays into that next step of we have a platform on the techcomm side, we bought it 12 years ago, it served our needs. But the training group, let’s say, has been talking and they have this other system that they’re not too happy with, and they want to see if they can start sharing our content. Well, then you have an open conversation to say, “Okay, how can we get to a shared solution, what do these requirements look like,” and go ahead and pick a system that kind of meets both requirements. But then you have that heavy lift of just saying, “Okay, so now we have these two different old systems and we need to dump our content, and I use that very generally, into the new system, so that everyone from those two groups can now author in the same place.” SO: And I’m thinking as you’re evaluating these systems, all other things being equal, which they are not, but all other things being equal, you would look for the one that’s more open, that is more flexible, knowing that things are going to change because they always do. What’s available to us that’ll give us maximum flexibility in a year or two or five when these new requirements come in that we have not anticipated at this point? BS: Right, because you’re exiting your old systems because they are potentially inflexible. We cannot accommodate anything new. We can sustain what we’re doing indefinitely, but we can’t accommodate this new thing that we need to do. SO: Yeah, it’s interesting because looking at the techcomm landscape, we have a lot of customers and a lot of just generalized ecosystem that has moved into structured content, and starting as early as the late nineties or maybe even the early nineties in Germany, people were moving into structured content at scale. And now we’re looking at it and saying, “Okay, well there’s all this other content out there and we need to look at that and we need to look at whether we can bring that into the structured content offerings.” But not unreasonably, those other groups are looking at it and pushing back and saying, “This isn’t optimized for the kind of work that I do. It’s optimized for the kind of work that you people do.
So how can we improve this and bring it into alignment with what the new and additional stakeholders need?” And it’s a hard problem, I really feel for the software vendors. It’s easy for us sitting here on the services side to say, “Hey, do better,” because we’re not doing the work. BS: Very, very true. And at that point, you have a winner and a loser, and I hate to say it that way, but you have a winner and a loser on the system side at that point. Where you’re pulling one other group in because you have an established structural approach and they could benefit from it, but basically they have to absorb the brunt of the change that’s going to happen, and it’s not necessarily fair. SO: Well, yeah. I mean, life isn’t fair. But also I’ll say that that pain that you’re talking about, the people that are now in structured content, they had that pain. It was just 10 years ago- BS: Very true. SO: …and they’ve forgotten. For those of you that were around and in this industry 10 years ago, or 20 years ago, or 25, I mean, remember what it was like trying to get people to move from “you will pry unstructured FrameMaker from my cold, dead hands.” You’ll pry Microsoft Word from my cold, dead hands. You will pry PageMaker, Interleaf, Ventura Publisher from my cold, dead hands. BS: WordStar. SO: Okay. So tools come and go, and the tool that is the state-of-the-art, BookMaster, for today is not necessarily the tool that’s going to be state-of-the-art for tomorrow or yesterday. I mean, basically this stuff evolves and we have to evolve with it, and we have to understand what are the best and most reasonable solutions that we can offer to a customer or to a content operations group in order to deliver on the things that they need to deliver on. BS: Very true. So there are no unicorns. SO: No unicorns, or maybe more accurately you can construct your own unicorn and it might be awesome, but it’s going to be a lot of work. BS: So I think we could probably talk about this for hours because there are so many different facets that we can touch upon, but I think we’ll call it done for now, and maybe we’ll see you soon in a new episode? SO: Yeah, if this speaks to you, call us because we’ve barely scratched the surface. BS: All right. Thanks, Sarah. SO: Thanks. BS: And thank you for listening to The Content Strategy Experts podcast brought to you by Scriptorium. For more information, visit Scriptorium.com or check the show notes for relevant links. The post Do enterprise content operations exist? appeared first on Scriptorium.
Survive the descent: planning your content ops exit strategy 18:06
Whether you’re surviving a content operations project or a journey through treacherous caverns, it’s crucial to plan your way out before you begin. In episode 176 of the Content Strategy Experts podcast, Alan Pringle and Christine Cuellar unpack the parallels between navigating horror-filled caves and building a content ops exit strategy. Alan Pringle: When you’re choosing tools, if you end up with something that is super proprietary, has its own file formats, and so on, that means it’s probably gonna be harder to extract your content from that system. A good example of this is those of you with Samsung Android phones. You have got this proprietary layer where it may even insert things into your source code that is very particular to that product line. So look at how proprietary your tool or toolchain is and how hard it’s going to be to export. That should be an early question you ask during even the RFP process. How do people get out of your system? I realize that sounds absolutely bat-you-know-what to be telling people to be thinking about something like that when you’re just getting rolling– Christine Cuellar: Appropriate for a cave analogy, right? Alan Pringle: Yes, true. But you should be, you absolutely should be. Related links: Nightmare on ContentOps Street (podcast) Enterprise content operations in action at NetApp (podcast) Content creature feature LinkedIn: Alan Pringle Christine Cuellar Transcript: Disclaimer: This is a machine-generated transcript with edits. Christine Cuellar: Welcome to the Content Strategy Experts podcast brought to you by Scriptorium. Since 1997, Scriptorium has helped companies manage, structure, organize, and distribute content in an efficient way. In this episode, we’re talking about setting your ContentOps project up for success by starting with the end in mind, or in other words, planning your exit strategy at the beginning of your project. So I’m Christine Cuellar, and with me today is Alan Pringle. Hey, Alan. Alan Pringle: Hey there. CC: And I know it can probably sound a bit defeatist to start a project by thinking about the end of the project and getting out of a new process that maybe you’re building from the beginning. So let’s talk a little bit more about that. Why are we talking about exit strategy today? AP: Because everything comes to an end. Every technology, every tool, and we as human beings, we all come to an end. And at some point, you are going to have tools, you’re gonna have technology and process that no longer supports your needs. So if you think about that ahead of time, and you’re ready for that inevitable thing, which will happen, you’re gonna be much better off. CC: Yeah. So this conversation started around the news of the DocBook Technical Committee closing, and that’s kind of a big deal for a lot of people, and it kind of sparked this internal conversation about, you know, what if that happened to you? How can people avoid getting caught by surprise? And of course, as Alan just mentioned, the answer to that is really to begin with the end in mind, to have an exit strategy, because everything does end at some point. So this got me thinking about, you know, I don’t know, Alan, you’ve seen the horror movie The Descent, right? You’ve seen that movie? Yes, because it’s amazing and it’s a horror movie and it’s awesome. So it made me kind of think of that because, you know, this group, and I’m not going to spoil it, no spoilers for people who haven’t seen it yet, but, if you haven’t, go watch it. The first one’s my favorite.
I haven’t seen the second one, so I’m biased. Anyways, that’s not the point. This group plans to go along one path, you know, down these caves which are definitely in North Carolina, right Alan? That’s definitely where they take place. AP: Well, they say it is in North Carolina, but it is quite clearly not filmed in North Carolina. As someone who is familiar with Western North Carolina, I had to laugh at this movie trying to pass off somewhere in the UK as like the Appalachian Mountains, but that’s just a quibble. So go ahead with your story. CC: Anyways, yeah, they got a mountain in there, right? And then there’s a path into the mountain. Of course, they’re going to explore this deep, dark cave. So they’re descending, as the name implies. And so they’re planning to go along one path. I think someone maybe tricked someone else along the way. I can’t remember. But they’re planning on going down one path. And there’s a lot of things that begin to happen that they didn’t plan on. And one scene in particular, there’s a cave that collapses, and of course that means they have to pivot, right. AP: Yeah. CC: So when you’re thinking about building an exit strategy and trying to plan for things that you can’t anticipate, how do you anticipate things you can’t anticipate? AP: Well, first of all, let’s be clear. All the things that happened in that movie happened in a period of like two hours or an hour and a half. And part of the issue with any kind of process and operations is things can slowly start to go badly and you just kind of keep on trucking and really don’t pay attention to it. But… CC: Yes. AP: It’s not just about fine-tuning your operations. That’s a whole other conversation. Your process is going to require updating every once in a while. There are going to be new requirements and you need to address them in your content ops by changing your process, updating your tools, maybe adding something new. What we’re talking about here is when those tools and that process are coming to an end, for example, because a particular piece of software is being deprecated. It is end of life. What are you going to do? CC: Mm-hmm. AP: What if there is a merger? You have a merger and there are two systems doing the same thing. One of those systems is going to lose and go away. Why are you going to maintain two of the same systems? So you’re going to have to figure out how to pivot to get to that. CC: Mm-hmm. AP: So there are all of these things that can happen that mean you have got to exit whatever you were doing and move into something new, something different. And the reasons are many, like I just mentioned, but the end result is, are you ready for when that happens? In a lot of cases, frankly, people aren’t. CC: Yeah. So if you could give listeners three pieces of advice on how to be less dependent on a particular system, if you had to narrow it down to three, what would you suggest to help them not be just dependent on one particular system or maybe a set of systems? AP: One thing is when you’re choosing tools, if you end up with something that is super proprietary, has its own file formats, et cetera, that means it’s probably gonna be harder to extract your content from that system because it is proprietary. Even if your content is in a standard, and in a lot of cases, of course, I’m talking about DITA, the Darwin Information Typing Architecture, an XML standard.
Even with DITA, even though it’s open source and a standard, some of the systems that can manage DITA content put their own proprietary layer on top. A good example of this is those of you with Samsung Android phones. I’ve had one in the past. CC: Yeah, that’s me. AP: Samsung puts their own proprietary layer on top of the Android operating system, and a lot of that stuff frankly I hate, but that’s not the point of this conversation, but it’s the same issue. You have got this proprietary layer where it may even insert things into your source code that is very particular to that product line. So look at how proprietary your tool or toolchain is and how hard it’s going to be to export. That should be an early question you ask during even the RFP process. How do people get out of your system? And I realize that sounds absolutely bat-you-know-what, to be telling people to be thinking about something like that when you’re just getting rolling– CC: Appropriate for a cave analogy, right? AP: Yes, true. But you should be, you absolutely should be. CC: And we’re going to get onto the other two things to think about in just a second, but a question there: what are some maybe green flags for how that question should be received, or how you want that question to be received, if it’s going to be the right fit? AP: I would hope some variation of the answer would be you can export to this standard, although that often is probably not the answer that you’re going to get. CC: Okay, a standard. What are some other things people need to keep in mind in order to not be system-dependent? AP: I don’t know if it’s so much system-dependent, but you need to think culturally about what this means. People become very attached to their tools because they become very adept. They become experts in how to manipulate and do whatever with a certain tool set. And they feel like, you know, I am in total control here. I know what I’m doing. Things are running well. CC: Yeah. AP: And when it turns out that tool is going to have to go away, their entire process and their focus on being an expert, it’s blown. It’s just blown away. And that can be very hard to deal with from a person level, a people level, having to tell people, yeah, this is a shock to your system. You’ve been using this tool forever. You’re really good at it. Unfortunately, that tool is being discontinued. We’re gonna have to move to something else. That can be very hard for people to swallow, and it’s understandable. CC: Mm-hmm. AP: It’s completely understandable. One other thing that I will mention is if you can, get your source content, not the actual delivery points I’m talking about here, but wherever you’re storing your source, into some kind of format-neutral file format, and again, I’m talking mostly about XML content, extensible markup language, because when you create that content, you are not building in the formatting. You’re creating it as a markup language. And the minute your content is in a markup language, it becomes easier. I shouldn’t say easy, because nothing here is easy. There is a better path to moving that content, possibly to another standard, for example, because you can set up a transformation process that’s very programmatic. CC: Mm. Yeah. AP: This particular element in this model becomes this. And when you hit this particular element in this model, you start a new file. If you see this particular attribute, it needs to be moved over here to this attribute. CC: Hmm.
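As a sketch of the programmatic, rule-based transformation Alan describes (this element becomes that element, this attribute moves over there, this element starts a new file), here is what that matching process might look like in Python. The element and attribute names are hypothetical, not taken from any real content model.

```python
# A minimal sketch of a rule-based migration between two XML models.
import xml.etree.ElementTree as ET

# "This particular element in this model becomes this."
ELEMENT_MAP = {"chapter": "topic", "heading": "title", "para": "p"}
# "If you see this particular attribute, it needs to be moved over
# here to this attribute."
ATTRIBUTE_MAP = {"role": "audience"}

def transform(elem: ET.Element) -> ET.Element:
    """Recursively map one model's elements and attributes onto another's."""
    new = ET.Element(ELEMENT_MAP.get(elem.tag, elem.tag))
    new.text, new.tail = elem.text, elem.tail
    for name, value in elem.attrib.items():
        new.set(ATTRIBUTE_MAP.get(name, name), value)
    for child in elem:
        new.append(transform(child))
    return new

def split_and_transform(source_path: str) -> None:
    """'When you hit this particular element, you start a new file.'"""
    root = ET.parse(source_path).getroot()
    for i, chapter in enumerate(root.iter("chapter"), start=1):
        ET.ElementTree(transform(chapter)).write(
            f"topic_{i}.xml", encoding="utf-8", xml_declaration=True
        )

# Usage, with a hypothetical legacy file:
# split_and_transform("legacy_book.xml")
```

A real migration would layer validation and reporting on top, but the core is this kind of element-for-element, attribute-for-attribute mapping table.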
AP: So it’s a matching process that you have to do, and it can be programmatic. So anytime you get into something that’s XML, and what does that X stand for? It stands for extensible. That gives you a little more control because it gives you more flexibility. And that’s weird to think, more flexibility gives you more control. That almost seems kind of diametrically opposed, but that’s true. CC: Yeah. AP: Because you can move something out more easily because it is something that can be sliced, diced, transformed. So there’s that angle. CC: Yeah. Yeah. So, okay. So as a non-technical person myself, I’m gonna see if I can summarize this and you tell me whether or not this is accurate. So from a very high-level view of this, it’s almost like, you know, rather than keeping all of your content in one particular content management system or something like that, it’s all stored in a separate box or a separate repository. And then whatever system you’re going to use is your delivery output. It’s almost like a, is that accurate to say? Okay. AP: Because if you’re in a file format that does not have the formatting of your content built in, that means you can deliver to a bunch of different presentation layers. You can automatically apply it. CC: Okay. AP: And that’s really, I was kind of headed that way. You can even see your new system as almost a delivery target: I need to figure out how to transform my source content in a way that a new tool, a new system, can understand. And so basically you’re saying, okay, let’s export it, let’s clean it up, maybe do some automated transformations and programming on it to make it more ingestible by the other system. CC: Mm-hmm. AP: So you could even look at this process of moving from one system to another as being really your Final Destination, another horror movie, your final delivery target: moving that source content into another system that you’re about to use. CC: Yeah. Thank you also for unpacking that because that was much more clear than my example, but that was really helpful. So since people are planning with the end in mind, how far out are we thinking this exit strategy would typically be implemented? How far down the road is this? AP: And that’s the thing, I can’t answer that question because you never know what is going to happen. Right, I mean, it’s like the cave collapse analogy you mentioned, sometimes you have to take a detour, not of your own choice or of your own making. And again, mergers, tools being discontinued, companies that go under, all of these things can happen. And you need to have a contingency. CC: Mm. You never know. So it’s a contingency plan, really. Yeah. AP: And you need to have a contingency plan in place to get ready to exit. It’s just like during natural disaster season, you hear people say, do you have your emergency preparedness kit ready? It’s a very similar thing, but it’s in the corporate world. This is as much about risk reduction as it is about smooth content operations, at least from my point of view. CC: Yeah. Yeah. And you mentioned several like big things that happen that can trigger the need to, you know, it’s time to exit and move on. Are there any scenarios where there isn’t a big thing that happens, like a merger or a business closing or different things like that? Are there more quiet ways where you may not realize that it’s time to exit?
Where the need to exit is more subtle. AP: If your content process, your content operations, cannot support new business requirements, for example, you need to connect to a new system, you need to deliver your content in another format. If your current system and tools can’t do that, that is a sign you’re probably going to have to find the exit door and find something that will support whatever it is that you cannot do. CC: Mm-hmm. AP: It’s usually you just hit this wall where you realize we have taken this tool and this process as far as it can go. It is time to move on. And here I am going to toot the consultant horn again. But that is when you start getting that uneasy feeling, that’s when you can talk to a consultant who can help you unpack it to see if it’s really a sign that the tool is no longer going to fit you or if there’s something you can do within your current system to make things work. That’s when a third-party point of view can be very valuable. CC: Question for you on that third-party perspective, since you’ve seen companies make these transitions many times and exit something and go into a new one, what’s one thing or pitfall that companies need to be aware of that maybe isn’t included in their exit strategy that should be? AP: Something that’s very common is to frame everything you want from your new system from the perspective of what your current system is doing. Even though your current system is not going to do something that you need it to do, you still are so fixated on how it is doing things and you can’t get beyond that. That can be a huge problem. Being able to step back and objectively look. This system can’t do this. CC: Mmm. AP: We need it to do that. And this is how we need to get there. People can get so mired in the “this is how we’re doing things.” And we’re going to move over to this new system and do the same exact thing, just in new tools. That’s not a reason to move. There’s some compelling thing that’s forcing you out of that other tool. So now is the time to change things, update things, make some nips and tucks. Maybe undo some things. Don’t just wholesale move over into a new system and keep things status quo. Otherwise, why bother? CC: Yeah, yeah. Is there anything else you can think of when you get to when it’s time to start the exiting process? Anything else that you can think of that companies need to have at the forefront of their mind? AP: It’s the communication. And that includes the vendors, and it includes the people inside the company who are using the tools. And I would also mention it includes procurement. They need to understand the whens, the whys, why you’re having problems, all that, because there can be contractual obligations about when a license ends and another one begins. So you’ve got to keep that information flowing to all kinds of parties to make this exit, this transition, work well. CC: Yeah, you want it to end like the American version of The Descent where the hero actually gets out and drives away in the car, not like the UK version where the person is still stuck in the cave, which is the better ending for a horror movie, I will say, but not for your content ops project. Definitely. AP: Yeah, but at least in a content ops project, you’re not going to get eaten by some humanoid blind thing living in a cave. CC: Hopefully, right? That’s ideal. That’s the best case scenario. AP: Hopefully not. Yeah. CC: Well, Alan, is there any other parting advice you can think of before we wrap up today’s topic?
AP: Don’t go into a cave unprepared. Okay? Just don’t. How’s that? CC: Yeah, don’t, yeah, that is actually good advice. Don’t go unprepared. That’s really helpful. And like Alan mentioned earlier, a third-party perspective, I know it’s very biased to be saying it, but a third-party perspective when it’s time to either make the exit transition or plan for the exit transition, content strategists can really help with that because we’ve seen a lot of things, a lot of caves. Yes. Yeah. AP: A lot. Maybe not cave dwellers, but a lot. CC: Hopefully, hopefully no one has actually seen those. Yeah, well, thank you so much for being here, Alan. I really appreciate you talking about this with me today. And thank you for listening to the Content Strategy Experts Podcast brought to you by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links. The post Survive the descent: planning your content ops exit strategy appeared first on Scriptorium.
Enterprise content operations in action at NetApp (podcast) 23:10
Are you looking for real-world examples of enterprise content operations in action? Join Sarah O’Keefe and special guest Adam Newton, Senior Director of Globalization, Product Documentation, & Business Process Automation at NetApp for episode 175 of The Content Strategy Experts podcast. Hear insights from NetApp’s journey to enterprise-level publishing, lessons learned from leading-edge GenAI tool development, and more. We have writers in our authoring environment who are not writers by nature or bias. They’re subject matter experts. And they’re in our system and generating content. That was about joining us in our environment, reap the benefits of multi-language output, reap the benefits of fast updates, reap the benefits of being able to deliver a web-like experience as opposed to a PDF. But what I think we’ve found now is that this is a data project. This generative AI assistant has changed my thinking about what my team does. Yes, on one level, we have a team of writers devoted to producing the docs. But in another way, you can look at it and say, well, we’re a data engine. — Adam Newton Related links: NetApp Product Documentation featuring Doc, NetApp’s GenAI-powered assistant AI in the content lifecycle Technical debt in content operations The business case for content operations LinkedIn: Adam Newton Sarah O’Keefe Transcript: Disclaimer: This is a machine-generated transcript with edits. Sarah O’Keefe: Welcome to the Content Strategy Experts podcast brought to you by Scriptorium. Since 1997, Scriptorium has helped companies manage, structure, organize, and distribute content in an efficient way. In this episode, we talk about content operations with Adam Newton. Adam is the senior director of global content experience services at NetApp. Hi everyone, I’m Sarah O’Keefe. Adam, welcome. Adam Newton: Hey there, how are you doing, Sarah? SO: It’s good to see and/or hear you. AN: Good to hear your voice. SO: Yeah, Adam and I go way back, which you may discover as we go through this podcast. And as those of you that listen to the podcast know, we talk a lot about content ops. So what I wanted to do was bring somebody in that is doing content ops in the real world, as opposed to as a consultant, and ask you, Adam, about your perspective as the director of a pretty good-sized group that’s doing content and content operations and content strategy and all the rest of it. So tell us a little bit about NetApp and your role there. AN: Sure. So NetApp is a Fortune 500 company. We have probably close to 11,000 or more global employees. Our business is primarily data infrastructure, storage management, both on-prem and in the cloud. We sell a storage operating system called ONTAP. We sell hardware storage devices, and we are, most importantly, I think, at this day and age, integrating with Azure, Google Cloud Platform, and AWS on first-party hyperscaler partnerships. My team at NetApp is… I actually have three teams under me. The largest of those three teams is the technical publications team. The other two are globalization, responsible for localization and translation of both collateral and product, and then finally, and most new to my team, our digital content science team, which is our data science wing. I have about 50 to 53, I think, employees at this point in my organization, and all told probably about a hundred with our vendor partners. SO: And so I think we all have a decent idea of what the technical publications team and the globalization teams do. Can you talk a little bit about the data science side?
What is that team up to? AN: Yeah, thank you for asking that question. So about two years ago, I was faced with an opportunity to hire. And maybe some of your listeners who are managers are familiar with that situation, right? I hope they are, rather than not being able to hire. I took a moment and thought a little bit more about what I needed in the future. And I thought a little bit differently about roles and responsibilities, opportunities inside NetApp and the broader content world, and decided to bring in a data scientist. And then I thought a little bit more about, well, there are other data scientists at NetApp. Why would I need one? And I thought a little bit about the typical profile of the data scientists at that time at NetApp, mostly in IT and other product teams. Those data scientists were primarily quantitative data scientists coming from computer science backgrounds. And I thought, well, you know, we’re in the content business. I want to find a data scientist who is a content specialist and who has a background in the humanities and who also has skills in core data science skills, emphasizing, for example, NLP. And so that was my quest. And I was very, very fortunate to find a PhD candidate in English who wanted to get out of the academy and who had these skills. And it’s been an incredible boon to our organization. We’ve even hired a second PhD in English recently. And Sarah, since you and I are friends, I’ll say one was from UNC and one was from Duke. Okay. So we don’t have to have that discussion here. I’m an equal opportunity person. Although I did hire the UNC one first, Sarah. SO: I see, I see. So for those of you that don’t live in North Carolina, this is… I’m not sure there is a comparison, but it is important to have both on your team. And I appreciate your inclusion of everybody. It is kind of like… I’ve got nothing. AN: Yes. SO: Okay, so you hired some data scientists from a couple of good universities. And do they get along? Do they talk to each other? AN: Fabulously, yes. No petty grievances. SO: Okay, just checking. All right. So how do you, in this context then, what does your environment look like? What kinds of things are you doing with the docs team? And what’s the news from NetApp docs? AN: So maybe a little bit of background actually, and you and I have talked about this previously, but we used to be a DITA shop. And then as things sped up inside our business with the adoption and development of cloud services at NetApp, we found that some of the apparatus of our DITA infrastructure, our past practices, weren’t able to keep up with the speed of the cloud services that were being developed. I think this is actually, I’ve talked to other people in our business, a very common situation. We handled it in one way. There are many ways to handle it, but the way we chose to handle it was to exit DITA and to move, in our source format anyway, to a format called AsciiDoc, which I frequently describe as a dialect of Markdown. And we went from being a closed system of technical writers working inside a closed CMS to adopting open source. We now work in GitHub. Our pipeline is all open source, and we now have contributors to our content that are not technical writers.
In some cases, they’re technical marketing engineers, solution architects, and so forth, as well as a pipeline of docs that we build through automations where we, for example, transform API specifications or reference docs that are maintained by developers and output those into our own website, docs.netapp.com. In addition to just the docs part, my globalization team has been using machine translation for many years. So, speaking to one particular opportunity of being in one organization, when we output our docs, and whenever we update our docs in English, they’re automagically updated in eight other languages and published to docs.netapp.com. So we roughly maintain 150,000 English files, and you can times those by eight. Is that right? Did I do the math right? Yeah. SO: Or nine, depending. AN: Nine. Yeah. Is English a language? Yeah, sure. Let’s count it. SO: Depends on how we use it. Okay, so you have AsciiDoc, you know, Markdown-ish. Is it fair to call it a Docs as Code environment? AN: So we often describe it as a content ops environment. I’m not sure if that is different from Docs as Code, but I think maybe I will accept that as a reasonable description in the sense that we have asked our team members to think about the content that they’re writing as highly structured, semantically meaningful units of information, in the same way I think a developer can be asked to think of their code being that way. And the systems in which we write, VS Code for example, many engineers are writing in that. SO: Mm-hmm. AN: And of course our source files, as I mentioned, and all our automation and our pipelines are based on being in GitHub. SO: And so then you’ve got docs.netapp.com as a portal or a platform where a lot of this content goes. And what’s happening over there? Do you have any news on new things you’ve done there? AN: Yeah. I mean, very recently, you know, the timing of this is really interesting. We have been working on a generative AI solution for a year, Sarah. You’ll recall the hype, right? When ChatGPT exploded into the public consciousness, right, through the media. And shortly thereafter, we began imagining what it might look like to leverage that technology, those types of technologies, to deliver a different customer experience. And we identified a chatbot as being something we thought could add to the browse and search experiences on docs.netapp.com. And we just released that on the 20th of August and announced it here internally inside of NetApp on the 27th. So we are literally like 48, 72 hours into a public adventure here. SO: I take full credit for planning it, even though I knew nothing about any of this. AN: Yeah. And that was a long time, I think it’s worth noting, too. It was a long time. And I think it’s beyond the full dimensions of this discussion to talk about why it took so long. But I will say, maybe, you know, we were early adopters and we felt the pain and the benefit of being that. You know, it was like changing the tires on a race car, right? That was speeding around the track. So we had to learn and be responsive and also humble, in the sense that there were some missteps that we had to recover from and some magical thinking, I think, at the beginning of the project that was qualified more over the course of the project. SO: And so what does that GenAI solution sitting in or over the top of the docs content set, what does that do in terms of your authoring process?
Are there any changes on the backend as you’re creating this content that is then consumed by the AI? AN: I would say we’re in the process of understanding the full implications of having this new output surface, this generative AI assistant, and fully grappling with what the implications are for the writers. We find ourselves frequently in discussions about audience. And audience is all those humans that we have been writing for, plus a whole bunch of machines that we now need to think more consciously about. We find ourselves often talking about standards and style, but not just from the perspective of, you know, writing the docs in a consistently patterned way for humans to be able to consume well, but also because patterns and machines are a marriage made in heaven. And we actually see opportunities to begin to think of the content we’re writing as a data set that needs to be more highly patterned and predictable so that a machine can consume it and algorithmically and probabilistically decide how to generate content from the content we’re creating. SO: And where is this going in terms of what’s next as you’re looking at this? I think you mentioned that there are other opportunities potentially to add more data slash content. AN: Yeah, actually, if I back up to a detail I shared earlier, but maybe quickly: we do have writers in our authoring environment who are not writers by trade. They’re people who are subject matter experts, right? And they’re in our system and they’re generating content. So that pitch was about, join us in our environment, right? Join us in our environment, reap the benefits of multi-language output, reap the benefits of fast updates, reap the benefits of being able to deliver a web-like experience as opposed to a PDF. But what I think we’ve found now is that this is a data project. This generative AI assistant has changed my thinking about what my team does. And I think, yes, on one level, true. Yes, we have a team of writers and there’s a big factory devoted to producing the docs. But in another way, you can look at it and say, well, we’re a data engine. We own and maintain a large data set, and the GenAI is one consumer of that data set. But we’re also thinking about our data set as being joinable to other data sets inside of NetApp. And in particular, I work inside the chief design office at NetApp, along with UX researchers and designers. And we’re also more broadly part of our shared platform team at NetApp. So we’re thinking about how might we join our data with other teams’ data to create in-product experiences that are data-led or data-driven in combination with curated experiences. So if your listeners were able to see me, I am waving my hand a little bit, not because I’m dissembling, but more because I’m aspiring. And I think there’s a really, really cool future ahead, in a way, Sarah, that I think is super energizing for the writers, right? To see that their work is being reframed, not replaced, right? The fear of writers with GenAI is of being replaced. Well, I would offer this as an example of, you know, maybe it’s not such a dismal view, and maybe in fact there’s a very interesting future if you reframe your thinking about what you do and the opportunities to join what you do to create different experiences.
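To make Adam’s framing of the GenAI assistant as one consumer of a docs data set concrete, here is a minimal sketch of a retrieval-augmented generation (RAG) flow over a docs corpus, in the spirit of what he describes. It is an assumption-laden illustration, not NetApp’s implementation: the Chunk shape, the toy bag-of-characters embedding, and the file paths are stand-ins, and a real system would swap in a proper embedding model and send the prompt to a hosted LLM.

```python
# Minimal RAG sketch: the docs are treated as a data set, and the generative
# assistant is one consumer of it. Hypothetical throughout; NetApp's actual
# pipeline is not public, so this only illustrates the pattern.
from dataclasses import dataclass

@dataclass
class Chunk:
    path: str  # source file, e.g. "docs/ontap/snapshots.adoc" (made up)
    text: str  # one semantically meaningful unit of content

def embed(text: str) -> list[float]:
    # Stand-in for a real embedding model: a trivial bag-of-characters
    # vector, used only so this sketch runs without dependencies.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, corpus: list[Chunk], k: int = 3) -> list[Chunk]:
    # Rank chunks by similarity to the question and keep the top k.
    q = embed(question)
    ranked = sorted(corpus, key=lambda c: cosine(q, embed(c.text)), reverse=True)
    return ranked[:k]

def build_prompt(question: str, context: list[Chunk]) -> str:
    # Constrain the model to the retrieved docs; this is what makes content
    # quality (patterns, terminology, gaps) surface so quickly.
    sources = "\n\n".join(f"[{c.path}]\n{c.text}" for c in context)
    return (
        "Answer using only the documentation below. "
        "If the answer is not in the documentation, say so.\n\n"
        f"{sources}\n\nQuestion: {question}"
    )
```

The shape of the flow (chunk, retrieve, constrain, generate) is the point: because the assistant can only work from what the corpus says, inconsistent terminology and missing topics surface immediately, which is exactly the “is it the AI or is it the content?” question discussed below.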
SO: And I think it’s an interesting perspective to look at GenAI as being a consumer of the content slash data that you’re putting out. A lot of the initial stuff was, this is great, GenAI will just replace all the tech writers. You’re talking about something entirely different. AN: I guess I wanted to expand on that because I think we’re actually now hovering on a really important point. You know, what is your mindset? How are you thinking about this moment in time? The broad we, right? You, or the broader us generally, who are in this industry. And, you know, I think we don’t see a great indication that GenAI can create net new content and do it well, honestly. I think it can do summarizing, it can make your day-to-day easier, your meeting notes and so forth, Microsoft Copilot, right? There are some great uses, but I have not seen convincing, compelling indicators that docs can be written by it, at least at the enterprise level, right? Our products are complex. We often talk about our writers as sense makers, right? And I think that we can take advantage of GenAI in the right ways. And I think this is one of the ways that we’re taking advantage of it, which is to give customers another experience. And frankly, also for us to learn a lot about what people are asking and assuming, so we can learn a lot and continuously improve. SO: So what’s happening on the delivery side? Somebody asks for some sort of information, and it either says it doesn’t exist or it gives an incorrect response. Are you seeing any patterns there? What are you doing with that? AN: Yeah, many of your listeners might have produced or delivered products themselves, right, and remember what happens in the first day or two of releasing a product, right? So the timing of this chat is really good. In the last couple of days, I was just talking to a data scientist on my team and I was saying, you know, what I think I see emerging here as a possible pattern is that people don’t actually know how to use these things effectively. They ask it questions that it really could never answer, or they don’t fully understand the constraints of the system, meaning that, well, it’s only based on a certain data set. You know, they don’t know that the data set doesn’t include the data they’re looking for, right? Because it sits somewhere else. You know, we’re modifying our processes to intake feedback. I think there’s a really interesting nexus: is it the AI or is it the content? That’s the really interesting one, right? You know, was the content ambiguous, deficient, duplicitous, whatever, you know, is that a word? SO: It is now. AN: At UNC we use that word, not at Duke. But it is an interesting discussion inside our organization when we receive a piece of feedback: what’s causing it? Is it the interpretive engine or is it our source? And so it’s exposing a lot of gaps in our content, or other suboptimal implementations. SO: I mean, we’ve said that in a sort of glib manner, because of course you’re living this day to day and hour by hour, but we’ve said that, you know, GenAI sitting over the top of a content set is going to uncover all your inconsistencies, all your missing pieces, all your, you know, over here you said update and over here you said upgrade. That was an example I heard from someone else. And so it basically uncovers your technical debt. AN: Yeah, beautiful. Yeah, bingo. You’re so right there.
Terminology, right? Oh my God. Can you believe how many ways we’ve talked about X, right? SO: Right, and the GenAI treats them as different because, well, it doesn’t think anything, right, but the pattern isn’t there and so it doesn’t associate those things necessarily. AN: Yeah, your listeners may commiserate with this: the use of words as both verbs and nouns, like cable. We often in our documentation talk about cabling devices. How would a GenAI know whether the writer of the question is using cable as a verb or a noun? SO: Mm-hmm. So as you’re working through this, with, you know, it sounds like two days of go-live plus a year or two or three of suffering. AN: Well, a year and two days, a year and two days. SO: You know, I think you’re further along than a lot of other organizations. Do you have any advice for those that are just beginning this journey and just looking at these kinds of issues? What are the things you did best or maybe worst, or would do the same way or not? What can you tell people that will maybe help them as they move forward? AN: Yeah, maybe think of it in the old people, process, systems dimensions. Actually, taking that latter one, systems, I would say beware the fascination of the system without thinking more about the processes and people that are going to be involved in the creation of some kind of generative AI solution. I think, you know, this is as much an adaptive people-and-process problem as it is a technical problem. Probably more, frankly, on the adaptive side. And from a process perspective, I’d say, be curious about what you learn. Be attentive to the specifics, but look for the broad patterns in the feedback or what you’re seeing as you develop these solutions. For me, I think I hinted at this before, and it has frankly been the epiphany of the project. There have been many, but I would really highlight this one, which is: what does my team do? What is the value of what they generate? And for me, yes, we are primarily a team that creates documentation, but, you know, holy smokes, the idea that we are data owners, and we govern a massive, semantically rich, non-determinant, fast-changing data set, that is super, super interesting. Even here inside NetApp, Sarah, we have teams reaching out to us who frankly before probably never thought about the docs. And all of a sudden, because we have this huge data set, they’re like, wow, we can, you know, stress test our systems or our new technologies using what they have. That’s a super cool moment for our team. SO: Yeah, I think you’re the first person that I’ve heard describe this sort of context shift from this is content to this is data, or this content is also data, or however you want to phrase that. But I think that’s a really interesting point and opens up a lot of fascinating possibilities, not least for the English PhDs of the world. That’s super helpful. AN: Is this where I confess that at one time I thought I was going to be one of those, and I got out because I realized I was terrible at it? SO: No, no, no, that goes in the non-recorded part of the podcast. Yeah, I’m going to wrap it up there before Adam spills all of the dirt. AN: Yeah, what am I compensating for, right? SO: But thank you, because this is really, really interesting.
And I think it will be helpful to the people listening to this podcast, because it’s so rare to get that inside view of what it really looks like and what’s really going on inside some of these bigger organizations as you move towards AI, GenAI strategies and figure out how best to leverage that. So thank you, Adam. And it’s great to see you. AN: No, Sarah, thank you. And actually, I would like to thank my team. I mean, it has been an incredible adventure, and I think the team is really amazing. SO: Yeah, and I know a few of them and they are great. So with that, thank you for listening to the Content Strategy Experts Podcast brought to you by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links. The post Enterprise content operations in action at NetApp (podcast) appeared first on Scriptorium .…
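Earlier in this episode, Adam describes the move to AsciiDoc source in GitHub, with an automated pipeline that rebuilds the English docs and machine-translates them into eight other languages. Below is a rough sketch of what one step of such a docs-as-code pipeline might look like. The branch name, directory layout, target-language list, and the translation queue are all assumptions, not NetApp’s setup; asciidoctor is the standard AsciiDoc processor, invoked here as an external command.

```python
# Hypothetical docs-as-code publish step: find AsciiDoc files changed since
# the main branch, build HTML for each, and queue a machine-translation job
# per target language. Paths, branch, and languages are illustrative only.
import subprocess
from pathlib import Path

TARGET_LANGUAGES = ["de", "es", "fr", "it", "ja", "ko", "pt", "zh"]  # assumed

def changed_adoc_files(since: str = "origin/main") -> list[Path]:
    # Ask git which .adoc files differ from the baseline branch.
    out = subprocess.run(
        ["git", "diff", "--name-only", since, "--", "*.adoc"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [Path(line) for line in out.splitlines() if line.strip()]

def build_html(source: Path, outdir: Path) -> None:
    # Asciidoctor converts AsciiDoc to HTML; -D sets the destination dir.
    outdir.mkdir(parents=True, exist_ok=True)
    subprocess.run(["asciidoctor", "-D", str(outdir), str(source)], check=True)

def queue_translation(source: Path) -> None:
    # Placeholder for whatever MT connector the pipeline actually uses.
    for lang in TARGET_LANGUAGES:
        print(f"queue MT job: {source} -> {lang}")

if __name__ == "__main__":
    for adoc in changed_adoc_files():
        build_html(adoc, Path("build/en"))
        queue_translation(adoc)
```

The design point is the one Adam makes: once the source is plain text under version control, the same diff that drives the English rebuild can drive every downstream consumer, translation included.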
1 Position enterprise content operations for success (podcast) 19:46
In episode 174 of The Content Strategy Experts podcast, Sarah O’Keefe and Alan Pringle explore the mindset shifts that are needed to elevate your organization’s content operations to the enterprise level. If you’re in a desktop tool and everything’s working and you’re happy and you’re delivering what you’re supposed to deliver and basically it ain’t broken, then don’t fix it. You are done. What we’re talking about here is, okay, for those of you that are not in a good place, you need to level up. You need to move into structured content. You need to have a content ops organization that’s going to support that. What’s your next step to deliver at the enterprise level? — Sarah O’Keefe Related links: The business case for content operations Do you have efficient content ops? Technical debt in content operations Content Ops Forecast: Mostly Sunny With A Chance Of Chaos (webinar) LinkedIn: Sarah O’Keefe Alan Pringle Transcript: Disclaimer: This is a machine-generated transcript with edits. Alan Pringle: Welcome to the Content Strategy Experts Podcast brought to you by Scriptorium. Since 1997, Scriptorium has helped companies manage, structure, organize, and distribute content in an efficient way. In this episode, we talk about setting up your content operations for success. Hey everyone, I am Alan Pringle and I am back here with Sarah O’Keefe in yet another podcast episode today. Hello, Sarah. Sarah O’Keefe: Hey there. AP: Sarah and I have been chatting about this issue. It’s kind of been this nebulous thing floating around, and we’re gonna try to nail it down a little bit more in this conversation today: this idea of setting up your organization and its content operations for success. And to start the conversation, let’s just put it out there. Let’s define content ops. What are content operations, Sarah? SO: Content strategy is the plan. What are we going to do, how do we want to approach it? Content ops is the system that puts all of that in place. And the reason that content ops these days is a big topic of conversation is because content ops in sort of a desktop world is, well, we’re going to buy this tool, and then we’re going to build some templates, and then we’re going to use them consistently. And that’s the end, right? That’s pretty straightforward. But content operations in a modern content production environment means that we’re talking about a lot of different kinds of automation and integration. So the tools are getting bigger, they’re scarier, they’re more enterprise level as opposed to a little desktop thing. And configuring a component content management system, connecting it to your web CMS, and feeding the content that you’re generating in your CCMS, your component content management system, into other systems via some sort of an API is a whole different kettle of fish than dealing with, you know, your basic old-school unstructured authoring tool. So yeah. AP: Right. But in their defense, for the people who are using desktop publishing, that is still content operations. SO: Sure, it is. AP: It’s just a different flavor of content operations. And frankly, a lot of people, a lot of companies and organizations outgrow it, which is why they’re going to this next level that you’re talking about. SO: Right. So if you’re in a desktop tool and everything’s working and you’re happy and you’re delivering what you’re supposed to deliver and basically it ain’t broken, then don’t fix it. You are done. You should shut off this podcast and go do something more fun with your time. Right?
What we’re talking about here is, okay, for those of you that are not in a good place, you need to level up. You need to move into structured content. You need to have a content ops organization that’s going to support that. What do you do? What’s your next step, and what does it look like to organize this project in such a way that you move up to that next level and can deliver all the things that you’re required to deliver in the bigger enterprise, whatever you want to call that level of things? So desktop people, I’m slightly jealous of you because it’s all working and you’re in great shape and good for you. I’m happy for you. AP: So making this shift from content operations in desktop publishing to something more enterprise level like you’re talking about, that is a huge mind shift. It is also technically something that can be quite the shock to the system. How do you go about making that leap? SO: Well, I’m reminded of a safety announcement I heard on a plane one time where they were talking about how, you know, when you open the overhead bins after landing, you want to be careful. And the flight attendant said, shift happens. And we all just looked at her like, did you actually just say that? And she sort of smirked. So making this shift can be difficult, right? And what we’re usually looking at is, okay, you’ve been using, you know, Word for the past 10, 15, 20, 57 years. And now we need to move out of that into something structured, XML, maybe DITA, and then get that all up and running. And so what’s going to happen is that you have to think pretty carefully about what does it look like to build the system and what does it look like to sustain it? Now here I’m talking particularly to large companies, because what we find is the outcome in the end, right, when this is all said and done and everything’s up and running and working, what you’re probably going to have is some sort of an organization that’s responsible for sustainment of your content ops. So you’re going to have a content ops group of some sort, and they’re going to do things like run the CCMS and build new publishing pipelines and keep the integrations moving and help train the authors. And in some cases, they’re kind of a services organization in the sense that you have an extended group of maybe hundreds of authors who are never going to move into structured content. So you’re taking on the, again, Word content that they are producing, but you’re moving it into the structured content system as a service, like an ingestion or migration service to your larger staff or employee population. Okay, so in the future world, you have this group that knows all the things and knows how to keep everything running and knows how to kind of manage that and maintain it and do that work. And probably in there, you have an information architect who’s thinking about how to organize content, how to classify and label things, how to make sure the semantics, you know, the actual element tags, are good and all that stuff. But right now, you’re sitting in desktop authoring land with a bunch of people that are really good at using whatever your desktop authoring tool may be. And you have to sort of cross that chasm over to, now we’re this content ops organization with structured content, probably a component content management system. So what I would probably look at here is, you know, what is the outcome?
You know, thinking about: the system has been stood up, we’ve made our tool selection, everything’s working, everything’s configured, everything’s great. What does it look like to have an organization that’s responsible for sustaining that? And that could be, you know, two or three or 10 people, depending, again, on the size and scope of your organization and the content that you’re supporting. But in order to get there, you first have to get it all set up. You have to do the work to get it all up and running. Our job typically is that we get brought in to make that transition, right? So for a large organization, we’re not going to be your permanent content ops organization. We might provide some support on the side, but you’re going to have people in-house that are going to do that. They’re going to be presumably full-time, permanent kind of staff members. They know your content and your domain, and they have expertise in, you know, whatever your industry may be. AP: Right. SO: Our job is to get you there as fast as possible. So we get brought in to do that setting-up piece, right? What are the best systems? What are the things you need to be evaluating? What are the weird requirements that you have that other organizations don’t have that are going to affect your decisions around systems and, for that matter, people, right? Are you regulated? What is the risk level of this content? How many languages are you translating into? What kind of deliverables do you have? What kind of integration requirements do you have? And when I say integration, to be more specific, maybe you’re an industrial company and so you have tasks, service, maintenance kinds of things, and you need those tasks, like how to replace a battery or how to swap out brakes, to be in your service management system so that a field service tech can look at their assignments for the day, which are, you know, go here and do this repair and go here and do this maintenance. And then it gets connected to, and here’s the task you need and here’s the list of tools you need. And here are all the pieces and parts you need in order to do that job correctly. Diagnostic troubleshooting systems. You might have a chatbot and you want to feed all your content into the chatbot so that it can interact with customers. You may have a tech support organization that needs all this content and they want it in their system and not in whatever system you’re delivering. So we get into all these questions around where does this content go? You know, where does it have tentacles into your organization, and what other things do we need to connect it to, and how are we going to do that? So I think it’s very helpful to look at the upfront effort of configuring, you know, making decisions, designing your system, and setting up your system, versus sustaining, enabling, and supporting the system. AP: There are lots of layers that you just talked about and lots of steps. It is very unusual, at least in my experience, to find someone, some kind of personnel resource, either within your organization or through hiring, who is going to have all of the things that you just mentioned, because it is a lot to expect one person to have all of that knowledge, especially if you are moving to a new system, and you’ve got a situation where the current people are well versed in what is happening right now in that infrastructure, that ecosystem. To expect them to magically shift their brain and figure out new things, that’s a lot to ask for.
And I think that’s where having this third-party consultant voice is very helpful, because we can help you narrow in on the things that are better fits for what you’ve got going on now and what you anticipate coming in the future. SO: Yeah, I mean, the thing is that what you want from your internal organization is the sustainability. But in order to get there, you have to actually build the system, right? And nearly always when people reach out to us and say, we’re making this transition, we’re interested, we’re thinking about it, et cetera, they’re doing it because they have a serious problem of some sort. We are going into Europe and we have no localization capabilities, or we have them, but we’ve been doing, you know, a little bit of French for Canada and a tiny bit of Spanish for Mexico. And now we’re being told about all these languages that we have to support for the European Union. And we can’t possibly scale our, you know, 2.5 languages up to 28. It just can’t be done. We’ll drown. Or people say, we have all these new requirements and we can’t get there. We’ve been told to take our content that’s locked into, you know, page-based PDF, whatever, and we’re being required to deliver it, not just onto the website and not just into HTML, but as, you know, content as a service, as an API deliverable, as micro content, all this stuff. And they just can’t, you can’t get there from here. And so you have people on the inside who understand, as you said, the current system really well, and understand the needs of the organization in the sense of these things that they’re being asked to do, and they understand the domain. They understand their particular product set internally. But it’s just completely unreasonable to ask them to stand up, support, and sustain a new system with new technology while still delivering the existing content, because, you know, that doesn’t go away. You can’t just push the pause button for five months. AP: No, the real world does not stop when you are going on some kind of huge digital transformation project like one of these content ops projects. So basically what we’re talking about here, especially on the front end, the planning and discovery side, is we can help augment, help you focus. And then once you’ve kind of picked your tools and you start setting things up, there are some choices there that sometimes have to do with, like, the size of an organization about how to proceed with implementation and then maintenance beyond that. Let’s focus on that a little bit. SO: Most of the organizations we deal with are quite large. Actually, all of the organizations we deal with are quite large compared to us, right? It’s just a matter of are they a lot bigger or are they a lot, a lot, a lot, lot bigger? AP: Correct. SO: Within that, the question becomes how much help do you want from us and how much help do your people need in order to level up and get to the point where they can be self-sufficient? We have a lot of projects we do where we come in and we help with that sort of big hump of work, that big implementation push, and help get it done. And then once you go into sustainment or maintenance mode, it’s 10% of the effort or something like that. And so either you staff that internally as you’re building out your organization internally, or we stick around in sort of a fractional, smaller role to help with that. The pendulum has kind of shifted on this over time. Way back when, it was get in, do the work, and get out.
We rarely had ongoing maintenance support. Then for a bit, we were doing a lot of maintenance relative to the prior efforts. And now it feels as though we’re seeing a little bit of a shift back to doing this internally. Organizations that are big enough to staff a content ops group or a content ops person are bringing it back in-house instead of offloading it onto somebody like us. We’re happy to do whatever makes the most sense for the organization. At a certain size, my advice is always to bring this in-house, because ultimately, your long-term staff member who has domain expertise on your products and your world and your corporate culture, and has social capital within your organization, will be more effective than offloading it onto an external organization, no matter how great we are. AP: To wrap up, I think I want to touch on one last thing here, and that’s change management. And yes, we beat that drum all the time in these conversations on this podcast, but I don’t think we can overstate how important it is to keep those communication channels open and be sure everyone understands what’s going on and why you’re doing what you’re doing. What we’ve talked about so far is very much, okay, we’ve come up with a technical plan, we’ve done a technical implementation, and now we’re going to set it up for success and maintain it for the long haul and adjust it as we need to as things change. But there are still a group of people who have to use those tools: your content creators, your reviewers, your subject matter experts, I mean, I can go on and on here. They are still part of this equation, and we can’t forget about them while we’re so focused on the technical aspects of things. SO: I would say this directly to the people that are doing the work, you know, the authors, the subject matter experts, the people operating within the system. I would look at this as an opportunity. It is an opportunity for you to pick up a whole bunch of new skills, new tools, new technologies, new ways of working. And while I know it’s going to be uncomfortable and difficult and occasionally very annoying as you discover that the new tools do some things really well, but the things that were easy in the old tools are now difficult, right? There’s just going to be that thing where the expertise you had in old tool A is no longer relevant and you have to sort of learn everything all over again, which is super, super annoying. But it’s fodder for your resume, right? I mean, if it comes to it, you’re going to have better skills and you’re going to have another set of tools and you’re going to be able to say, yes, I do know how to do that. So I think that just from a self-preservation point of view, it makes a whole lot of sense to get involved in some of these projects and move them forward, because it’s going to help you in the long run, whether you stay at that organization or whether you move on to somewhere else, you know, at some point in the future. That’s one of the ways I would look at this. It is certainly true that the change falls on the authors, right? AP: Correct. SO: They all have to change how they work and learn new ways of working, and there’s a lot there, and I don’t want to, you know, sort of sweep that aside, because it can be very painful.
We try to advocate for making sure that authors have time to learn the new thing, that people acknowledge that they’re not going to be as productive on day one in the new system as they were in the old system that they know inside out and upside down, that they get training and knowledge transfer, and just, you know, a little bit of space to take on this new thing and understand it and get to a point where they use it well. So I think there’s a combination of things there. For those of you that are leading these projects, it is not reasonable, again, to stand the thing up and say, go-live is Monday, so, you know, I expect deliverables on Tuesday. That is not okay. AP: Yeah. And you’ve just wasted a ton of money and effort because you’ve thrown a tool at people who don’t know how to use it. So all of your beautiful setup kind of goes to waste. So there are a lot of options here as far as making sure that your content ops do succeed. And I think it’s like pretty much everything else in consulting land: it is not one size fits all. SO: It depends, as always. We should just generate one podcast and put different titles on it and just say it depends over and over again. AP: Pretty much. We’d probably just get an MP3 of us saying that phrase over and over again and just loop it, and that will be a podcast episode. And on that not-great suggestion for our next episode, I’m gonna wrap this up. So thank you, Sarah. SO: Thank you. AP: I think she just choked on her tea, everyone. SO: I did. AP: Thank you for listening to the Content Strategy Experts Podcast brought to you by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links. The post Position enterprise content operations for success (podcast) appeared first on Scriptorium .…
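Sarah mentions feeding CCMS content into other systems via an API, for example handing a field service tech exactly the task they need as micro content. As a loose sketch of what such a content-as-a-service call might look like: the endpoint, query parameters, and JSON shape below are entirely hypothetical, since real CCMS APIs differ from vendor to vendor, so treat this as the shape of the integration rather than a recipe.

```python
# Hypothetical "content as a service" call: a field-service app pulls one
# task topic from a component content management system over REST. The URL,
# parameters, and JSON structure are invented for illustration.
import json
import urllib.request

CCMS_BASE = "https://ccms.example.com/api/v1"  # made-up host

def fetch_topic(topic_id: str, lang: str = "en") -> dict:
    # Request one reusable component, already filtered to a language.
    url = f"{CCMS_BASE}/topics/{topic_id}?lang={lang}&format=json"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def render_task(topic: dict) -> str:
    # Micro content: just the steps and required tools, not a whole manual.
    lines = [topic["title"], "", "Tools: " + ", ".join(topic.get("tools", []))]
    lines += [f"{i}. {step}" for i, step in enumerate(topic.get("steps", []), 1)]
    return "\n".join(lines)

# A service management system could embed this in a work order, e.g.:
#   print(render_task(fetch_topic("replace-battery", lang="de")))
```

The design choice worth noting is that the consuming system asks for a component, not a document: the same source topic can feed the website, the chatbot, and the field service app without being copied into any of them.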
1 Conquering content localization: strategies for success (podcast) 19:23
Translation troubles? This podcast is for you! In episode 173 of The Content Strategy Experts podcast, Bill Swallow and special guest Mike McDermott, Director of Language Services at MadTranslations, share strategies for overcoming common content localization challenges and unlocking new market opportunities. Mike McDermott: It gets very cumbersome to continually do these manual steps to get to a translation update. Once the authoring is done, ideally you just send it right through translation and the process starts. Bill Swallow: So from an agile point of view, I am assuming that you’re talking about not necessarily translating an entire publication from page one to page 300, but you’re saying as soon as a particular chunk of content is done and “blessed,” let’s say, by reviewers in the native language, then it can immediately go off to translation even if other portions are still in progress. Mike McDermott: Exactly. That’s what working in this semantic content and these types of environments will do for a content creator. You don’t need to wait for the final piece of content to be finalized to get things into translation. Related links: MadCap Software MadTranslations Accelerate global growth with a content localization strategy (podcast) Lost in translation? Create scalable content localization processes LinkedIn: Bill Swallow Mike McDermott Transcript: Disclaimer: This is a machine-generated transcript with edits. Bill Swallow: Welcome to the Content Strategy Experts podcast, brought to you by Scriptorium. Since 1997, Scriptorium has helped companies manage, structure, organize, and distribute content in an efficient way. In this episode, we explore strategies for conquering localization challenges and unlocking new market opportunities. Hi everybody. I’m Bill Swallow, and with me today is Mike McDermott from MadCap Software. Hey Mike. Mike McDermott: Hi Bill. BS: So before we jump in, Mike, would you like to provide a little background information about you, who you are, what you do at MadCap? MM: Sure. My name is Mike McDermott. I am the director of language services at MadCap Software, working with our MadTranslations group. And we support companies that work in single-source authoring and multichannel publishing tools like those offered by MadCap Software, for IXIA and MadCap Flare and Xyleme and other tools. BS: So Mike, what are some of the challenges you’ve seen, and what works for overcoming some of these localization challenges? MM: One of the main challenges I see with companies that come to us, and they typically come to us because they’re looking at working in an XML-based authoring tool and they’re curious about the advantages it has for translation, one of the biggest challenges I see initially with these companies is just figuring out what content needs to go into translation when you’re working in different types of tools. And one of the ways I see to solve that problem is working in a tool where you have the ability to tag certain content and identify content for different audiences or different purposes. It just makes it simpler to identify that content and get it straight into translation, and removes a lot of the human error around packaging up content and trying to figure out yourself what files house text that might be translatable for whatever the output is that you’re looking to build.
So just working in those tools inherently helps with translation, I see, because it helps you identify exactly what needs to be translated and it gets it into translation much quicker. BS: So I think we’re talking about semantic content there, and making sure that you have all the right metadata in place so that you can identify the correct audience, the correct, let’s say, versions of the product, whether to translate or not, and any other relevant information about the content. So you’re able to isolate the very specific bits of content that need to be translated and omit a lot of the content that isn’t necessarily needed for that deliverable. MM: Exactly, Bill. It lets the technology tell you what needs to be translated and what houses text, versus you trying to go through a file list and determine, what do I need to send out to a translator to translate? The flip side of that is to just send everything for translation, but it’s very rare that everything in any given project for any type of system is going to need to be translated. So by tagging it in that way, you can quickly get into the translation and get things moving. And what I see happening at the end of these projects, oftentimes when you’re not working in those types of systems, is you end up finding bits and pieces of content or different files that ended up needing to be translated that missed that initial pass. Now they have to go back through translation and you’re delayed. So just getting everything right the first time and relying on the tools to tell you exactly what needs to be translated, by looking up metadata or different tags, just simplifies the process and speeds everything up, helps translation get done quicker, and just improves time to market for the end user to get their content out. BS: So it sounds like it reduces a good amount of friction, especially with regard to finding missing bits and pieces that should have been translated and weren’t, and then needing to go back and make sure that’s done in time. What are some other ways that people can reduce friction in their translation workflow? MM: Well, a big emphasis for us over the past few years around removing friction is working with connectors and different technologies that can orchestrate the translation process. So we can automate a lot of this and remove the bottlenecks around someone having to, like I said before, manually go into a set of files and package things up for a translator, zip up files, upload them to different locations, and they just get passed around, and things can happen when working that way, even outside of just missing files. So working with connectors and these technologies that can connect directly into these systems and get the text right into translation, removing all those friction points, just eliminates a lot of room for error, project delays, and bottlenecks for tasks that can be easily handled by modern technology. BS: And I assume that there’s probably some technology there as well that can govern other parts of the workflow, like review, content validation, that type of thing? MM: Exactly, exactly. So we’re trying to automate the flow of data into the different points in translation and then get the content ready. For example, for reviewers, you mentioned reviewers. So once content gets into translation, we can get it right into the translation system from the authoring environment that the customer’s working in.
And as soon as the translation is done, a human reviewer on the client side or on our side or whoever can be notified that this content is ready for review, and it just helps keep things moving. So now it’s on them to complete their review. And once that’s done, the process can continue on, and the automated QA checks and the human QA checks can be done at that point, and then the project can be pushed back to wherever it needs to go and put into publication. But by automating the steps and plugging in the humans where they provide the most value, it just removes the time costs and error-prone steps that don’t need to be there. BS: So it sounds like a lot of it does come down to saving a good deal of time. I would also imagine that these types of workflows also help streamline a lot of the publishing needs that come after the translation as well. MM: Correct. And that’s kind of why we started MadTranslations when we did: to provide our customers a place to go to work with a translation agency that understood these tools and understood how these bits and pieces come together to build an output. We put it together to provide our customers a turnkey solution where they can get a working project back that they can quickly get into publication. By removing the friction points and using modern technology to automate a lot of these processes, we’re able to get things into translation and get the translation into the final deliverable much faster. So once that happens, we can build the outputs and check them; if it requires a human check, things can get to that point much quicker, and we’re not waiting for somebody to manually pull down files and put them into another location so the next step can actually take place. We want to automate that part of it so we can get to that final output, into a project file where a customer can plug it into their publishing environment and get it out as quickly as possible. A lot of the wasted time is around those manual steps, and when it comes to validation and review, it’s just the reviewers and validators maybe not being ready for the validation or not being educated on how it will work. So it’s important to make sure that everyone in that process knows how it’s going to be done, and when things are going to be ready for the review or the QA checks. And then the idea from there is to just feed the content in via connectors, removing the friction points, and just send it through. And this is necessary, especially when you’re doing very frequent updates and kind of a more agile translation workflow. It gets very cumbersome to continually do these manual steps to get to a translation update. Once the authoring is done, ideally you just send it right through translation and the process starts. BS: So from an agile point of view, I am assuming then that you’re talking about not necessarily translating an entire publication from page one to page 300, but you’re talking about as soon as a particular chunk of content is done and it’s “blessed,” let’s say, by reviewers in the native language, then it can immediately go off to translation even if other portions are still in progress. MM: Exactly. Exactly. And that’s what working in this semantic content and these types of environments will do for a content creator: you don’t need to wait for the final piece of content to be finalized to get things into translation.
So as you said, it becomes even more important when you’re doing updates, because you don’t want to have to send over the entire file set every time you’re doing an update. Whereas when you’re working in a more linear format like Word, you end up having to send that full file every time, and the translation agency is likely reprocessing it using translation memory. But all that stuff still takes time, and working in these types of tools, you can very quickly identify those new parts or those bits that you know are ready for translation, tag them or mark them in some way, and send them through the translation process. BS: Very cool. So a lot of the work that we’re seeing now on the Scriptorium side of things is in re-platforming. So people have content in an old system, or they have, say, a directory full of decaying Word files, and they want to bring it into some other new system. They want to modernize, they want to centralize everything, basically have a situation where they’re working in DITA or some other structured content, bring it into semantic content. What are some of, I guess, the benefits that doing that gives you as far as translation goes, when you’re looking at content portability, so being able to jump ship from one system to another? MM: I think working in those systems where the text or the content is stored away from the output that you’re building has a lot of benefits, not only for translation, being able to just get the text that needs to be translated exported out of the system and then put back where it needs to go, but it really future-proofs you and gives you the portability that you talk about to make changes, because the text is stored in a standard format that can be ported. Versus, you see some organizations getting locked into a closed environment where, when it comes time to make a change, it requires certain types of exports to other file types that other tools can then import. But by storing them in a standard way, in XML, for example, it gives you that flexibility and future-proofs you from being locked into any one scenario. BS: Excellent. So I have to ask, since I’ve come from a localization background as well, what’s one of the hairier projects that you’ve seen, or one of the hairier problems that people can run into, in a localization workflow? MM: One of the challenges we run into sometimes is around client review, when you start incorporating validators into the translation system and include them as part of the process, and when you get multiple reviewers. Sometimes that will happen where a company will assign a reviewer for every language, but you might have different people reviewing the same set of content. I mean, the biggest delay that we see with projects is the translation is delivered, and then it’s dumped on the desk of a native speaker within the company, and they’re asked to review it and they’re not ready to do the review, it’s not scheduled, and it can delay the project. That’s one of the biggest delays we see. So that’s why we try at the front end of a project to figure out, on the client side, what’s going to happen after we deliver this project, after we send the files? Is the content going to be reviewed or validated? If so, let’s figure out a way to incorporate them into our translation system where they can review the translations before we build the outputs and do all the QA checks. So that’s one of the hairier situations in terms of time delays. Expectations around just time in general have always been a thing in localization.
As you know, people can be surprised as to how long it can take for a translator to get through content. I mean, the technology is certainly there to speed it up. Since we started MadTranslations a little over 10 years ago, we’ve seen the translation speed increase quite a bit, but it still takes time for a good translator to get through that content and know when to stop and do the research that’s needed to get a technical term right. So that’s one of the surprise moments, I think, for new buyers of localization: the time that it can take. And there are solutions in place, like I said, to make it go faster. But if you want that human review and that expertise and the cognitive ability to know when to stop and figure out what this term is, or what the client wants or doesn’t want around certain terminology, and then to database it and include that as part of the translation asset so it stays consistent every time, that takes time, versus just sending something through machine translation, doing a quick spot check, and sending it back to the customer. BS: So it sounds like having that workflow defined and setting those expectations that certain things need to happen at each point of that workflow. Some of it might be automated, some of it does require a person, and that person, I guess, should probably be identified ahead of time and given a heads-up that, “Hey, something’s going to be coming at you in three weeks. Be ready for it.” MM: Be ready for it. And also, what are you ready for? So it’s kind of training a reviewer: what are you looking for here? Are we looking for key terms? Are we looking for style preferences? Everyone kind of understanding what it is that a reviewer is going to be looking for, and they might be looking for different things when it comes to technical documentation versus a website, for example. So just having everyone communicate and understand what the intended purpose of the final output is, and where everyone fits in the process, and defining a schedule around that process definitely helps. BS: Definitely. I know myself, working for a translation agency, I’ve seen cases where a client came to me and basically said, “I need this done as soon as possible. What can you do?” And it was a highly technical manual, and we said, “Well, we have an expert in these different languages. This person is available now. This one won’t be available until next month. And this person really only works nights and weekends because they are a professional engineer in their day job.” So turnaround was going to be a little slow, and the client persisted that we just need it as soon as possible. We need to get it out the door in a couple of weeks. And I’m thinking to myself in the back of my head, why are you coming to us now when you need this in a couple of weeks? You shouldn’t just be throwing it over the fence at the last possible minute and expecting it to come back tomorrow. So there was that education. Unfortunately, they decided that they didn’t care. They wanted us to use as many translators as possible and get it done as quickly as possible. And we had them sign documents that basically said that we are not liable for the quality of the translation, since the client is basically looking to get this done as quickly and as cheaply and as dirty as possible.
It was a nightmare, and I think it took one round of review on the client side for them to basically circle back and say, “Okay, I get what you were saying now.” None of the translations worked together at all, because we were literally sending each chapter to a different translator, and there was no style guide because the client hadn’t provided anything. There was no terminology set because the client didn’t provide anything, and everything came back different. And they said, “Okay, we get it. We get it. We’ll revise our schedules, get it done the right way. I don’t care how long it takes.” MM: I’ve run into something very, very similar to what you described, and it was, put disclaimers in the documents saying this is going to be poor quality. We’re admitting it right now. This is the only way we’re going to get it back within a week, and we do not recommend publishing. And as soon as the files came back and someone looked at them, it was, “Okay, let’s back up and do it the right way.” BS: Yes. I guess the biggest takeaway there is plan ahead and plan for quality, and not just try to get it done as fast as possible. MM: And that’s one of the benefits of where we sit at MadTranslations with MadCap Software: companies coming into these types of environments are typically at the front end, the planning stages, trying to figure out how all this is going to work. So we have an ability to help them understand what the process looks like and then define it, in combination with our tooling and their needs, and come up with a workflow that’s going to keep things moving fast but gives you that human-level quality that everyone needs at the end. BS: Being able to size up exactly what the process needs to look like before you’re in the thick of it definitely helps. And having that opportunity to coach someone through setting up the process for the first time, I’d say that’s definitely priceless, because so many mistakes can happen out of the gate, between how people are authoring content and what their workflow looks like. MM: And it’s even more important for companies that have to maintain the content. So it’s one thing to just take a PDF and say, “Hey, I need to translate this file and I’m never going to have to update it again. I just need a quick translation.” It’s another to have a team of authors dispersed around the globe working on the same set of content that then needs to be translated continuously. So, different needs, but like you said, planning, defining the steps, and knowing what the requirements of the content are, from authoring to time to publication in each language, and how to fit the steps together to meet that as best as possible, is best done, like you said, upfront, versus when it needs to be published in a week. BS: Planning, planning, planning. I think that sounds like a good place to leave it. Mike, thank you very much. MM: Thank you, Bill. Thanks for having me on. BS: Thank you for listening to the Content Strategy Experts Podcast, brought to you by Scriptorium. For more information, visit scriptorium.com or check the show notes for relevant links. The post Conquering content localization: strategies for success (podcast) appeared first on Scriptorium .…
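Mike’s point about letting the technology, rather than a person, decide what goes to translation can be illustrated with a small metadata filter. This is a minimal sketch, assuming XML topics that carry translate and audience attributes (DITA, for example, uses translate="yes" or "no"); the directory layout and the audience convention here are assumptions, not any particular tool’s behavior.

```python
# Sketch: select only the XML topics whose metadata marks them translatable
# for a given audience, instead of hand-picking files for the translator.
from pathlib import Path
from xml.etree import ElementTree

def needs_translation(path: Path, audience: str) -> bool:
    root = ElementTree.parse(path).getroot()
    if root.get("translate", "yes") == "no":
        return False  # explicitly excluded from translation
    topic_audience = root.get("audience")  # None is read as "all audiences"
    return topic_audience is None or topic_audience == audience

def translation_package(content_dir: Path, audience: str) -> list[Path]:
    # The resulting list is exactly what gets handed to the connector.
    return [
        p for p in sorted(content_dir.rglob("*.xml"))
        if needs_translation(p, audience)
    ]

# Example: hand precisely this set to the translation workflow, nothing more:
#   for f in translation_package(Path("content"), audience="administrator"):
#       print(f)
```

Because the selection is computed from the source metadata every time, the bits and pieces that miss the initial pass largely disappear: if a topic is marked translatable, it is in the package by construction.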
1 Cutting technical debt with replatforming (podcast) 24:05
When organizations replatform from one content management system to another, unchecked technical debt can weigh down the new system. In contrast, strategic replatforming can be a tool for reducing technical ... Read more » The post Cutting technical debt with replatforming (podcast) appeared first on Scriptorium .…
1 Renovation revelations: Managing technical debt (podcast) 19:12
Just like discovering faulty wiring during a home renovation, technical debt in content operations leads to unexpected complications and costs. In episode 171 of The Content Strategy Experts podcast, Sarah ... Read more » The post Renovation revelations: Managing technical debt (podcast) appeared first on Scriptorium .…
1 Accelerate global growth with a content localization strategy 24:32
In episode 170 of The Content Strategy Experts podcast, Bill Swallow and Christine Cuellar dive into the world of content localization strategy. Learn about the obstacles organizations face from initial ... Read more » The post Accelerate global growth with a content localization strategy appeared first on Scriptorium .…
1 Strategies for AI in technical documentation (podcast, English version) 20:57
In episode 169 of The Content Strategy Experts podcast, Sarah O’Keefe and special guest Sebastian Göttel of Quanos engage in a captivating conversation on generative AI and its impact on ... Read more » The post Strategies for AI in technical documentation (podcast, English version) appeared first on Scriptorium .…
1 Strategien für KI in der technischen Dokumentation (podcast, Deutsche version) 25:17
Episode 169 is available in English and German. Since our guest Sebastian Göttel works on AI in the German-speaking world, the idea came up to record this podcast in German. The ... Read more » The post Strategien für KI in der technischen Dokumentation (podcast, Deutsche version) appeared first on Scriptorium .…