
How to Use ChatGPT and MarketMuse to Analyze Your Content


There’s a better way to do this! Read Using MarketMuse AI to Create Content and watch the accompanying video.


While most content marketers look to ChatGPT for content generation, it works equally well for evaluating content. In this post I’ll show you how, with a slight modification, you can turn our default Question Answer Prompt into a content evaluation prompt.

It’s something you can use with either net-new content or existing pages. While MarketMuse Content Score gives you a quick read on overall topical coverage, this prompt helps you evaluate individual sections and paragraphs.

Let’s get started!

Let’s Look at the Original Prompt

Here’s what the original prompt looks like. It looks a little odd because of all the text I selected. Normally, if I were using this prompt to answer a question, I’d have selected just “Basic concepts of growing tomatoes”.

Ask A Question default prompt.

Instead, a whole bunch of text follows it. That text is the answer to the question, and it’s the content we want to evaluate.

Also note that the question isn’t really formatted as a question, but that doesn’t matter; ChatGPT will give you an answer anyway!
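
Since the screenshot doesn’t come through here, here’s a rough Python sketch of how that selection reads once it reaches ChatGPT: the outline concept first, then the pasted text underneath. The topic and text are placeholders I’ve made up for illustration; the exact wording in the MarketMuse editor may differ.

```python
# Rough shape of what gets selected: the outline concept (not phrased as a
# question), followed by the text that answers it. Placeholder values only;
# the actual prompt wording in the MarketMuse editor may differ.
question = "Basic concepts of growing tomatoes"
selected_text = (
    "Tomatoes need at least six hours of full sun a day, well-drained soil "
    "rich in organic matter, and consistent watering..."
)

original_selection = f"{question}\n\n{selected_text}"
print(original_selection)
```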

Modifying the Prompt to Evaluate Content

Now let’s make a couple of changes to this prompt. We don’t want it to generate content; we want it to evaluate the content we’ve selected. Here’s how it looks now.

Some slight modifications to the default prompt.

Let’s keep our focus topic, as that’s very important. Next, we ask “how well does the following text answer the question?” Although it may not be apparent, the question is in the first few words of the selected text. It’s not phrased as a question, but it is one of the concepts that appear in the outline.

Fortunately, it doesn’t matter to ChatGPT that it’s not formatted as a question.

Also included in this prompt is a request to rate the text on a scale of 1 to 10, with 10 being the highest. Lastly, we make sure to specifically indicate the body of text to evaluate.
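
To make that concrete, here’s a minimal Python sketch that assembles an evaluation prompt along those lines and sends it through the OpenAI chat API. It isn’t the exact MarketMuse prompt; the wording, the placeholder topic and text, and the model name are all my own choices.

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

focus_topic = "growing tomatoes"                 # keep the focus topic in the prompt
question = "Basic concepts of growing tomatoes"  # the outline concept, not phrased as a question
text_to_evaluate = (
    "Tomatoes need at least six hours of full sun a day, well-drained soil "
    "rich in organic matter, and consistent watering..."
)

# Assemble the evaluation prompt: focus topic, the "how well does this answer
# the question" request, a 1-10 rating, and a clearly delimited body of text.
prompt = (
    f"The focus topic is {focus_topic}. "
    f"How well does the following text answer the question \"{question}\"? "
    "Rate it on a scale of 1 to 10, with 10 being the highest.\n\n"
    f"Text to evaluate:\n{text_to_evaluate}"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any chat model will do
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

Pasting the same structure into the ChatGPT interface works just as well; the code simply makes the prompt’s moving parts explicit.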

ChatGPT’s Response

Here’s ChatGPT’s response to the new prompt. It generated the text that it’s evaluating, so maybe it shouldn’t come as a surprise that it gave itself a score of 9 out of 10!

One thing’s for sure. ChatGPT isn’t modest, rating the answer it created a 9 out of 10!

The response also provides some justification for the assigned score, which is somewhat more reassuring than seeing a bare number.

Thoughts on Improving the Prompt

ChatGPT tends to put a positive spin on its responses to this prompt — even to content that I would judge as poor. An improvement, then, would be to get it to offer constructive criticism — simply ask what could be done to improve the text.

This modified prompt solicits detailed feedback from ChatGPT.

One last thing to note is that the rating (the score out of 10) seems to be arbitrary; it can change substantially with each regeneration. That may be due to the lack of any specific evaluation criteria. In the end, understanding how to make something better is probably more valuable than knowing its current score.
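
One way to act on both of those points is to ask for improvement suggestions and give ChatGPT explicit criteria to score against. Continuing the sketch above, this is only one possible wording; the criteria are examples of mine, not anything MarketMuse prescribes.

```python
# Improved evaluation prompt: explicit criteria to make the score less
# arbitrary, plus a request for constructive criticism. Placeholders as before.
focus_topic = "growing tomatoes"
question = "Basic concepts of growing tomatoes"
text_to_evaluate = (
    "Tomatoes need at least six hours of full sun a day, well-drained soil "
    "rich in organic matter, and consistent watering..."
)

criteria = [
    "completeness: does it cover the key subtopics?",
    "accuracy: are the statements correct?",
    "clarity: is it easy to follow?",
]

improved_prompt = (
    f"The focus topic is {focus_topic}. "
    f"How well does the following text answer the question \"{question}\"? "
    "Rate it on a scale of 1 to 10 against these criteria:\n"
    + "\n".join(f"- {c}" for c in criteria)
    + "\n\nThen list specific changes that would improve the text.\n\n"
    f"Text to evaluate:\n{text_to_evaluate}"
)
```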

Takeaways

The standard question prompt, with a slight modification, can turn into a powerful content evaluation prompt. Use it to evaluate individual paragraphs, sections, or an entire page. It can even generate constructive criticism to help you improve your content.

Stephen leads the content strategy blog for MarketMuse, an AI-powered Content Intelligence and Strategy Platform. You can connect with him on social or his personal blog.
