As users, however, we expect the information uttered by these devices to be true, distinct, and precise, not derived from statistical calculations over heaps of unstructured data.
To achieve that, user information has to be semantically enriched by knowledge graphs.
Applying iiRDS in this context is a huge step towards consistent and highly reliable statements.
2. Warm-up Session (3 mins)
3. Keynote Speech (20 mins): an inspiring speech on leveraging C-end (consumer) design to achieve breakthroughs in B-end (business) documentation.
• Construction logic of the GSM model and its measurement system
• From scenario-based analysis to design delivery
• Four behavioral patterns of B-end users in content consumption
• Returning to design and closing the value loop
4. Huichuan's practical experience sharing (20 mins)
• Analysis of current situation and corresponding measures
• Construction of evaluation system for document quality
• Practical implementation of the evaluation system for document quality
5. Hands-on Activity 1: Analyze the current state of the document development business in a specific industry/company, focusing on key quality issues
Time: 20 mins
Form: Group-based, selecting a specific industry/company as a representative
Requirements: Reach a consensus within the group and arrive at clear conclusions
Sharing: Choose representatives to share the results
6. Hands-on Activity 2: Practice analyzing strategies (activity kit provided on-site)
Time: 20 mins
Form: Continue to focus on the issues from Activity 1 and utilize the activity kit to generate analysis results (e.g., metrics, test cases)
Requirements: Reach a consensus within the group and arrive at clear conclusions
Sharing: Choose representatives to share the results
7. Hands-on Activity 3: Practice applying quality evaluation
Time: 20 mins
Form: Select samples from the product materials mentioned in Activity 1 and apply the metrics and test cases in on-site measurement activities
Requirements: Reach a consensus within the group and arrive at clear conclusions
8. Closing: Presentation of results by each group
Time: 5 mins per group, 20 mins in total
Voting: Each person has 3 voting cards to evaluate the activities of each group
Award: One group is selected to receive the Outstanding Award, followed by a recognition ceremony
How can a higher degree of standardization and automation be achieved in the preparation, exchange, and processing of manufacturer information? The goal of Digital Data Chain (DDC) technology is to improve process efficiency for all participants, so that everyone benefits. DDC is a combined technology comprising the IEC 61406 identification link, VDI 2770 digital manufacturer information, and the information exchange platform (IEP). We will focus on the sharing of digital manufacturer information based on VDI 2770.
2. Most companies have invested in CAD development and have BOMs for their products, yet they must invest again to create traditional technical documentation from scratch. Cortona3D technology makes it possible to move away from new investments in old-fashioned documents and instead reuse existing 3D CAD data for the automated generation of next-generation, high-quality 3D technical documentation.
3. This documentation is updatable, can be visualized in any web browser, supports Augmented Reality, and can be easily managed with a CMS.
4. This technology allows companies to extend product lifecycle management from design and manufacturing to after-sales, linking together and reusing all the product data a company already has to produce modern technical documentation with significantly less time and effort.
In this topic, we will use ODX files (Open Diagnostic eXchange, a standardized file format used in various industries to exchange diagnostic data between diagnostic tools and software applications) as input. Using a combination of tools, including R&D ODX, ChatGPT, a CMS, a quality tool, and appropriate service and diagnostics knowledge, we will discuss and demonstrate a brand-specific approach and practice for significantly reducing the time needed to produce error-code descriptions, optimizing fault-tracing efficiency, and improving the user experience from the perspective of aftermarket diagnostics technicians.
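As a rough illustration of this kind of pipeline (a sketch only: the element names are simplified placeholders rather than the full ODX schema, the file name is invented, and `draft_description` merely stands in for whatever LLM call is used), a script might extract trouble codes from an ODX-style XML export and draft first-pass descriptions for review:

```python
# Sketch: extract trouble codes from a simplified ODX-style XML export and
# draft first-pass descriptions with an LLM. Element names are illustrative
# placeholders, not the full ODX schema.
import xml.etree.ElementTree as ET

def load_trouble_codes(odx_path: str) -> list[dict]:
    """Collect (code, short text) pairs from the export."""
    tree = ET.parse(odx_path)
    codes = []
    for dtc in tree.iter("DTC"):  # assumed element name
        codes.append({
            "code": dtc.findtext("DISPLAY-TROUBLE-CODE", default=""),
            "short_text": dtc.findtext("TEXT", default=""),
        })
    return codes

def draft_description(code: str, short_text: str) -> str:
    """Placeholder for the LLM step (e.g., ChatGPT behind the CMS/quality toolchain)."""
    prompt = (
        f"Write a concise, technician-facing description for fault code {code}. "
        f"Known short text: {short_text}. Include likely causes and first checks."
    )
    # return call_llm(prompt)  # plug in the model of your choice here
    return prompt  # returned as-is so the sketch stays runnable offline

if __name__ == "__main__":
    for entry in load_trouble_codes("diag_layer_export.odx"):  # hypothetical file
        print(draft_description(entry["code"], entry["short_text"]))
```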
This workshop explores the evolution of STE from an aerospace standard to its potential in revolutionising documentation practices in the software industry. By embracing STE, participants will gain insights into how this linguistic paradigm shift can help organisations across diverse sectors enhance clarity, efficiency, and comprehension in their software-related communication. The flexibility of Simplified Technical English allows for customisation to suit unique requirements. As the specification primarily involves additions to the dictionary, clients themselves often make customisations to cater to different projects within the organisation.
Your team applies terminology lists, which do not (entirely) comply with the terminology lists of other teams in your company. Your company’s terminology is inconsistent for team colleagues and customers alike. All this leads to unnecessary confusion and miscommunication.
Then let us take you with us on our terminology race!
In this presentation, we, Porsche project lead Ira Rotzler and Kerstin Berns, managing partner of berns language consulting, will share why and how we started the Porsche terminology race, where we stand right now, and how we plan to cross the finish line!
We will demonstrate how we unified a great number of very differently formatted terminology lists into one neat data set and how we cleaned up all that data. We will also share how we managed to lay the foundation for high-quality terminology management that is useful to the entire Porsche brand in all countries, worldwide! And of course, we will shed light on how we automated this process and which methodologies and tools were applied.
In the end, we will talk about how the project team successfully created a reliable and useful database that benefits Porsche in various ways, e.g., by improving recognition of vehicle functions and their understandability throughout different customer touchpoints.
Finally, we take a sneak peek into the international future of multilingual terminology work at Porsche as well as at the future usefulness of clean terminology for simply everything!
1. Changes that the emergence of large models brings to technical communication for database technology
The impact of technological development on the technical communication workflow, and how that workflow iterates
2. Exploring the introduction of large models into technical communication for database technology
An introduction to exploring technical communication practices and workflow improvements, using the OceanBase database as an example
3. Comparative demonstration of offline LLM results based on corpus optimization
Online large-model corpus testing
4. Future directions and focus areas for technical communication in the database field
Even if some are still trying to move from PDF to HTML, we think that the next era will be direct human interaction with that content.
Ekrai is an example of a Personal Digital Expert: she is an expert, and you can ask her any question related to any detail of your product, both for pre-sales and after-sales as well as technical documentation needs.
We will explain how we built it using the latest Generative AI technologies and why it could be considered a "best practice" to be followed in many companies.
First, define objectives. While maintaining the same level of delivery quality, consider the following:
- How can we reduce the frequency of mouse clicks?
- How can we minimize the manual operations and decision-making that are prone to errors?
- How can we shorten the time spent waiting for asynchronous information inputs?
- How can we implement clear division of labor and collaboration rules with minimal communication costs?
To enhance efficiency through automation, we have the following approaches (a short script sketch follows the list):
- Keyboard shortcuts, graphical user interface (GUI) quick buttons, (command line) command aliases
- Regular expressions and functions in data tables
- Macros and scripts
- Workflow automation tools
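As a minimal example of the regex-and-scripts category (the folder name and term variants below are invented for illustration), a few lines of Python can normalize inconsistent wording across a folder of AsciiDoc sources in one pass:

```python
# Sketch: batch-normalize term variants across AsciiDoc sources with regex.
# The folder name and the variant/standard terms are made up for illustration.
import re
from pathlib import Path

REPLACEMENTS = {
    r"\buser guide\b": "User Guide",        # unify casing
    r"\blog-in\b": "log in",                # unify verb form
}

for doc in Path("docs").glob("**/*.adoc"):
    text = doc.read_text(encoding="utf-8")
    for pattern, standard in REPLACEMENTS.items():
        text = re.sub(pattern, standard, text, flags=re.IGNORECASE)
    doc.write_text(text, encoding="utf-8")
```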
Next, analyze the workflow and identify inefficient processes.
Finally, by integrating the automation techniques across various dimensions while considering constraints on manpower and budget, design a comprehensive solution to be tested, iterated upon, and implemented.
This presentation will use the example of my toolchain (AsciiDoc + Git/GitLab + VS Code + Feishu) to illustrate efficiency improvement methods across different dimensions, particularly the utilization of API scripts to connect multiple tools and streamline automation processes.
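One hedged sketch of such an API script (the webhook URL is a placeholder, and the text-message payload follows Feishu's commonly documented custom-bot shape rather than anything prescribed by this talk) forwards a documentation build result from a GitLab CI job into a Feishu group chat:

```python
# Sketch: push a GitLab pipeline result into a Feishu group via a custom-bot webhook.
# The webhook URL is a placeholder; adapt the message shape to your bot's settings.
import requests

FEISHU_WEBHOOK = "https://open.feishu.cn/open-apis/bot/v2/hook/<your-token>"  # placeholder

def notify(project: str, branch: str, status: str) -> None:
    message = {
        "msg_type": "text",
        "content": {"text": f"[{project}] docs build on {branch}: {status}"},
    }
    resp = requests.post(FEISHU_WEBHOOK, json=message, timeout=10)
    resp.raise_for_status()

if __name__ == "__main__":
    # In practice this would be called from a GitLab CI job or a webhook handler.
    notify("asciidoc-manuals", "main", "passed")
```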
Firstly, I will provide a brief overview of the principles and development of Generative AI, highlighting its groundbreaking advancements in the field of natural language processing, such as large language models (LLMs). I will then delve into the specific applications of these technologies in translation services, including accelerating translation speed and improving quality, optimizing terminology consistency and contextual accuracy, promoting the sharing and reuse of language resources, and assisting translators in engaging in more creative and value-added work.
Furthermore, the presentation will demonstrate the auxiliary functions provided by Generative AI for translation services, such as translation memory, terminology management, and automatic proofreading tools, and explore new modes of collaboration between AI and translators.
Lastly, I will discuss the challenges and opportunities of applying Generative AI in translation services, including issues related to data privacy, terminology consistency, and cultural sensitivity. I will also provide insights into the innovative trends shaping the future of translation services and offer feasible strategies and suggestions for practitioners.
2. From the reader's perspective, the combination of text and graphics is beneficial for understanding. Take product after-sales service as an example: when customers request part replacements but use part names that differ from the manufacturer's, the drawings help both customers and manufacturers pinpoint the exact parts.
3. Production specifications and quality criteria for document drawings. Engineering or research drawings are transformed into document drawings, ensuring their accuracy while protecting the confidentiality of the original drawings. Drawings and text complement each other. Line drawings, being vector graphics, can be zoomed in and out easily on the web, while bitmaps balance clarity against file size.
4. Pictorial documents: documents primarily composed of images, with text as a secondary element. This session mainly introduces the 3D rendering of pictorial documents.
5. Dynamic documents (defined as videos). In the media age, videos serve not only to entertain the public and promote products but also to demonstrate product disassembly and maintenance.
For any documentation, authors and subject matter experts work together to arrive at quality. I strongly believe that it is a shared responsibility for customer satisfaction.
Our various quality initiatives for improving the process, as well as Root Cause Analysis (RCA) in Tech Comms, did not disappoint us: they helped improve the process and reduce the recurrence of defects. We have been applying RCA to documentation defects in our team since 2014. This has helped fix many of the root causes of documentation defects and, at the same time, strengthened the foundation of the process. Here we share our experience of how we used RCA and related activities, such as the Bugathon, to progressively enhance documentation quality along with the process.
This is a major obstacle on the way into the Generative AI world, because it is clear that enterprise knowledge must be structured to avoid AI hallucinations.
In this workshop we will see how it is possible to move from unstructured, miscellaneous documents to a high-quality structured knowledge base.
We will learn concepts of Knowledge Architecture.
We will learn concepts of Prompt Engineering.
We will see why a structured workflow is important for reaching a high-quality structured knowledge base.
At the end we will see how this structured knowledge can be used to feed a RAG scenario, as in the sketch below.
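To make that last step concrete, here is a minimal sketch only: retrieval is reduced to a toy keyword-overlap score instead of real embeddings and a vector store, and the generation call is left as a stub.

```python
# Sketch: a toy RAG loop over a structured knowledge base.
# Retrieval is a simple keyword-overlap score; a real system would use
# embeddings plus a vector store, and the final call would go to an LLM.

KNOWLEDGE_BASE = [
    {"id": "KB-001", "title": "Pump start-up", "body": "Open the inlet valve before starting the pump."},
    {"id": "KB-002", "title": "Error E42", "body": "Error E42 indicates a blocked filter; clean or replace it."},
]

def score(question: str, chunk: dict) -> int:
    """Count shared words between the question and a chunk (toy retriever)."""
    q_words = set(question.lower().split())
    c_words = set((chunk["title"] + " " + chunk["body"]).lower().split())
    return len(q_words & c_words)

def answer(question: str, top_k: int = 1) -> str:
    """Pick the best-matching chunks and build the grounded prompt."""
    context = sorted(KNOWLEDGE_BASE, key=lambda c: score(question, c), reverse=True)[:top_k]
    prompt = "Answer using only this context:\n"
    prompt += "\n".join(f"[{c['id']}] {c['body']}" for c in context)
    prompt += f"\n\nQuestion: {question}"
    # return call_llm(prompt)  # the generation step would happen here
    return prompt

if __name__ == "__main__":
    print(answer("What does error E42 mean?"))
```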
✅ Are you looking for precise content marketing strategies to capture developers' interest, seize their attention, and unleash their potential?
✅ Are you also thinking about questions like what are the characteristics of AI large model developers compared to traditional ones? What kind of decision-making journey do they go through in product selection? What are the gains, pains, and key considerations at each stage?
✅ Have you noticed that although AI model developers are generally not interested in marketing tactics, they unconsciously buy into certain marketing strategies? So, what is the magic of these strategies?
✅ Are you curious about how top AI model vendors attract and empower users through technical content and marketing? How they build and deepen trust through community activities and support services? And how they establish and expand ecosystems through product iteration and publicity?
This talk, drawing on the author's work on technical content, open-source community, and developer operations at 01.AI, a large-model company, will analyze practical dilemmas, sort out thinking processes, explore problem-solving ideas, and provide solutions, helping you create high-quality content, strengthen marketing strategies, and expand your network, so as to build a vibrant open-source AI developer community.
In this speech, I will explain PLG, discuss how it affects product content demand, analyze methods of formulating corresponding content strategies, and explore the development trend of content teams based on their capabilities. I hope that this speech will provide a clearer understanding of the new industry directions and offer insights into the transformation of content teams.
Due to diverse and complex products and customer demands, technical documentation often contains a large volume of content and a complex information architecture. How can we provide effective support and assistance to customers, enabling them to access the information they need in the simplest, most timely, and most accurate manner amidst a wealth of information, thus enhancing their user experience?
To achieve this, we need to analyze factors such as information labeling, product attributes, and customers' usage scenarios, and use cloud platforms to establish information delivery channels between the content and customers, enabling precise matching, distribution, and delivery of technical information in digital form throughout the process from creation to publication.
In this way we can reduce the reliance on printed materials, thereby lowering material costs and allowing for timely content updates and more flexible design cycles.
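As a hedged sketch of this "precise matching" idea (the labels, products, and scenario values are invented examples, not the actual platform's data model), each published topic carries metadata and the delivery layer filters topics against the customer's context:

```python
# Sketch: match published topics to a customer's product and usage scenario
# via metadata labels. All labels and values are invented for illustration.

TOPICS = [
    {"title": "Quick installation", "product": "X200", "scenario": "installation", "format": "html"},
    {"title": "Troubleshooting E17", "product": "X200", "scenario": "maintenance", "format": "html"},
    {"title": "Quick installation", "product": "X300", "scenario": "installation", "format": "pdf"},
]

def deliver(product: str, scenario: str) -> list[dict]:
    """Return only the topics whose labels match the customer's context."""
    return [t for t in TOPICS if t["product"] == product and t["scenario"] == scenario]

if __name__ == "__main__":
    for topic in deliver("X200", "installation"):
        print(topic["title"], "->", topic["format"])
```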
I will also address the challenges and ethical considerations arising from the integration of AI in technical communication. As AI algorithms autonomously generate content, there is a need for scrutiny to ensure accuracy, clarity, and adherence to ethical standards. Moreover, the discussion will extend to the importance of maintaining equilibrium between automated and human elements in technical communication, emphasizing the collaborative potential embodied in AI-human partnerships, often referred to as the “human-in-the-loop” approach.
In summary, this presentation invites attendees to engage in insightful discussions on navigating the evolving rhetorical situations of technical communication in the era of generative AI. By exploring both the advancements and ethical dimensions, participants will gain valuable insights into strategies for effectively adapting to the transformative impact of AI on technical communication practices.
As a startup providing end-to-end solutions in the field of chip verification, XEPIC offers a comprehensive range of seven product series that cover the demands of digital chip verification and provide effective verification solutions. XEPIC therefore has a pressing and complex demand for technical documentation. Its documentation team has adopted the "Docs as Code" concept to establish a lightweight document development platform from the ground up, combining open-source and self-developed tools. This platform effectively meets the company's requirements for bilingual (Chinese and English) publishing in multiple formats (PDF and HTML) and styles. In this speech, the team will share the insights and experience gained from these practices, offering inspiration to professionals facing similar requirements.
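The speech does not name the underlying tools, so purely as an assumed illustration (Sphinx appears here only as a stand-in for whichever open-source or self-developed generator is actually combined), a bilingual, multi-format build can be driven by a small script:

```python
# Sketch: drive a bilingual (en/zh), multi-format (HTML/PDF) documentation build.
# Sphinx is only a stand-in here; swap in whatever generator your platform uses.
import subprocess

LANGUAGES = {"en": "build/en", "zh_CN": "build/zh"}

for lang, out_dir in LANGUAGES.items():
    # HTML output for the web portal
    subprocess.run(
        ["sphinx-build", "-b", "html", "-D", f"language={lang}", "source", f"{out_dir}/html"],
        check=True,
    )
    # PDF output via the LaTeX builder (make-mode target)
    subprocess.run(
        ["sphinx-build", "-M", "latexpdf", "source", f"{out_dir}/pdf", "-D", f"language={lang}"],
        check=True,
    )
```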
The application of LLMs in technical writing competitions nationwide vividly demonstrates their potential in enhancing writing efficiency and content innovation. From planning and development to revision and delivery, LLMs span the entire document creation process, significantly improving the quality and efficiency of the entire writing workflow.
The shift from manual content to conversational chatbots markedly enhances the user's interactive experience. The instant response and personalized assistance of chatbots make it possible for users to obtain information based on specific needs, which not only enhances user satisfaction but also brings new opportunities to professionals in technical writing.
The use of LLMs in user support systems showcases their precision and efficiency in handling complex queries. Leveraging LLMs, we can provide a more interactive and customized user experience, whether the user prefers text, voice, or video interactions.
In summary, the development of LLMs represents a significant leap in user assistance, transitioning from paper manuals to intelligent chatbots. This not only greatly improves the user experience but also brings revolutionary changes to the technical writing industry.
This workshop will explore the convergence between traditional content and AI, and how to communicate traditional content using AI to optimize user experience. We will utilize the IBM AI Essentials Framework and tools to guide you through the process of developing strategies with AI as the medium. Together, we will work towards a new model of AI content co-design.
2. A concise overview of RISC-V, the next-generation computer instruction set architecture, and its domestic and international development today.
3. Introducing the current state of products and documentation development under the RISC-V architecture.
4. Why is DITA the best partner for content development under the new instruction set architecture, RISC-V?
a. DITA's topic-based authoring vs. the smooth transfer from IP to final devices in the lifecycle of IC products
b. DITA's customization vs. RISC-V's high level of customization and modularization
c. DITA's reuse and filtering features vs. the fragmented nature of RISC-V and its associated products (see the filtering sketch after this list)
d. DITA's collaborative sharing vs. RISC-V's diverse ecology, characterized by vendor leadership and the parallel operation of various communities
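To make point (c) tangible, here is a minimal sketch of DITAVAL-style conditional filtering; the topic fragment and product values are invented, and the simplified filter only mimics what the DITA-OT does during publishing:

```python
# Sketch: DITAVAL-style filtering of a DITA task topic by the @product attribute.
# The topic fragment and product names are invented; the DITA-OT performs the
# real filtering during publishing -- this only mimics the idea.
import xml.etree.ElementTree as ET

TOPIC = """<task id="config-core">
  <title>Configure the core</title>
  <taskbody>
    <steps>
      <step product="core-a"><cmd>Enable the vector extension.</cmd></step>
      <step product="core-b"><cmd>Enable the compressed instruction set.</cmd></step>
      <step><cmd>Save the configuration.</cmd></step>
    </steps>
  </taskbody>
</task>"""

def filter_topic(xml_text: str, include_product: str) -> str:
    """Drop elements whose @product value does not match the chosen build."""
    root = ET.fromstring(xml_text)
    for parent in list(root.iter()):      # snapshot of elements before editing
        for child in list(parent):
            product = child.get("product")
            if product and product != include_product:
                parent.remove(child)      # exclude steps flagged for other products
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(filter_topic(TOPIC, include_product="core-a"))
```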