Editor’s note: WRAL TechWire contributor Dr. Sarah Glova is a globally recognized speaker, successful entrepreneur, university instructor, and business consultant. A seasoned educator and entrepreneur, Sarah is CEO of the award-winning digital media firm, Reify Media. With a Ph.D. in Instructional Technology and a Master of Science in Technical Communication, she is dedicated to cultivating forward-thinking work environments.
Her new column asks industry leaders to share their insights, experiments, and concerns about AI in their respective fields.
RALEIGH – A recent study conducted by Acendance highlights North Carolina as a leading state for job postings seeking ChatGPT expertise. I reached out to Jessica Mitsch Homes, Co-Founder and CEO of the tech training and recruitment company Momentum, and to Wake Technical Community College to hear how both are preparing for the changing landscape.
In the study, Acendance examined recent job postings that explicitly sought candidates equipped with ChatGPT skills. The analysis found that 19% of these positions were located in California, with Texas following at 10%. Notably, North Carolina emerged in third place, with 7% of the postings specifically requesting ChatGPT experience, according to the report.
“California, Texas, and North Carolina lead the way in ChatGPT job creation, thanks to the heavy presence of tech companies in places like Silicon Valley, Austin, and the Research Triangle respectively,” writes Chas Cooper, CEO of Luminos, LLC, the company that publishes Acendance, in the report. “These states have an outsized share of job postings in Engineering, Data Science, and Product Management for software companies and IT services companies.”
Homes says that her community is ready.
“Our stance at Momentum is that we must lean into all available technology people will be expected to use in the workforce—both now and in the future,” Homes told me.
She also shared a quote from NYU professor Scott Galloway, which she said she’s shared widely in the Momentum community: “AI isn’t going to take your job, but someone who understands AI will take your job.”
What do job seekers need to know about ChatGPT?
When I asked Homes what training on AI tools looks like at Momentum, she told me that they are teaching their participants “best practices on how to correctly prompt AI tools.”
A “prompt,” in the context of ChatGPT or other chat-based AI tools, is the written instruction a user provides to the language model.
“Knowing how to best frame a question to these tools is critical in order to get the best use of them,” Homes told me.
Homes said she advises trainees to improve their prompting by “tinkering” with the AI tools, and that Momentum encourages its community to experiment so they build experience writing prompts.
I also spoke with Dr. Keith Babuszczak, Provost of Information Technology programs at Wake Technical Community College, who agreed that experience with AI “is now showing up in job descriptions.”
Babuszczak told me that Wake Tech sees opportunities for students to use tools like ChatGPT to strengthen resumes and prepare for interview questions.
“As with any tool, there are opportunities for misuse; however, we’re finding ways to shine a light on the technology—both the technical aspects behind it and the productive, individual uses for it,” Babuszczak told me.
What do employers need to know about adding ChatGPT experience to their job requirements?
“With the use of free AI tools like ChatGPT, Bard, and the many additional tools that are being developed with OpenAI in the background—we will start seeing more companies request applicants to have experience incorporating AI tools into their workflows,” Homes told me.
She also told me that she hopes employers are “intentional” about adding “ChatGPT experience” as a requirement to their job posts, and, specifically, that they avoid being “too specific” about that requirement.
“[W]e often see that new technologies will be written into job descriptions with ‘required years of experience’ that often doesn’t match reality,” Homes told me. “Companies like to request more years of experience than the technology has been around for. This requirement mismatch can deter qualified diverse applicants from applying.”
To illustrate her point, Homes pointed me to a Harvard Business Review study that found men apply for a job when they meet only 60% of the qualifications, while women tend to apply only if they meet 100% of them.
“We should avoid this and understand that candidates who express curiosity and can give examples of how they’ve experimented with using AI tools is enough to ask of a candidate at this stage in the [adoption] of this technology,” said Homes.
What does ChatGPT have to say?
As always, I asked ChatGPT for an opinion as well—with the prompt, “More and more job postings are now listing ‘experience with ChatGPT’ as a job requirement. How do you think tech training and job training programs should respond?”
The response focused on how AI might “augment” the training experience:
As more job postings are including ‘experience with ChatGPT’ as a job requirement, tech training and job training programs should respond by incorporating AI language models into their curriculum. They should offer specific courses or modules that focus on understanding and utilizing ChatGPT effectively. These programs should emphasize hands-on experience and practical applications of AI models, enabling learners to develop skills in interacting with and leveraging AI-powered tools.
Additionally, training programs should prioritize teaching critical thinking and ethical considerations related to AI, ensuring that learners understand the limitations, biases, and potential risks associated with AI technologies. By adapting their training offerings to include ChatGPT and related skills, tech training and job training programs can better prepare individuals for the evolving job market and the increasing demand for AI-powered tools and applications.
– ChatGPT inquiry, June 7, 2023
But Homes gets the last word, and she cautioned that tinkerers should nurture “skepticism” when using these tools and practice fact-checking.
“You need to be able to discern if the output information from the tools is accurate,” said Homes. “As users of AI tools, we must have a lot of skepticism, and develop the muscle of checking sources for accuracy.”