General questions

 

What does your company do?
VERN's artificial intelligence software detects human emotion in communication. This patented system includes a new model of communication, incongruity detection, and a process for machine-learning refinements from the preliminary to the final emotion score. VERN gives AI systems the ability to analyze challenging emotions and difficult expressions of speech such as humor and sarcasm.

How does VERN™ work?

VERN works by analyzing latent emotional clues in text, which can be ingested via voice-to-text or entered directly as text through the software. People leave intentional clues for others to pick up on in order to understand each other's "personal frame of reference."

You pass a sentence through the software, and VERN returns that sentence to you with an emotion recognition analysis on a 1-100 confidence scale. From there, you can use these insights to create new technologies.
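As a rough sketch of that flow, a client call might look something like the example below. The endpoint URL, request fields, and response shape here are hypothetical placeholders for illustration only, not VERN's documented API:

import requests

API_URL = "https://api.example.com/vern/analyze"   # hypothetical endpoint
API_KEY = "your-api-key"                            # hypothetical credential

def analyze(sentence):
    # Send one sentence and return the emotion analysis for it.
    response = requests.post(
        API_URL,
        headers={"Authorization": "Bearer " + API_KEY},
        json={"text": sentence},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

result = analyze("Great, my flight got cancelled again.")
# Illustrative shape of a result: the sentence echoed back with
# confidence scores on a 1-100 scale, e.g.
# {"text": "...", "emotions": {"anger": 84, "humor": 62, "sadness": 17}}
print(result)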


What is it used for?
Wherever a human and a computer meet, there's a use case for VERN. From video games to chatbots, from telehealth technologies to signals intelligence, VERN can help a computer understand how people feel.

What kind of emotions can you detect?

Right now, VERN can detect sadness (96.7% accuracy in internal testing; 80%+ in the field), anger (91% accuracy in internal testing; 80%+ in the field), and, of course, humor (79.8%+ accuracy across the board).

In 2022 we added two more emotion detectors, Love & Affection and Fear, with similar accuracy.


Can you really detect emotions without facial recognition, or voice?

Absolutely. In fact, facial recognition is fraught with bias and error, and people can easily cover their emotions. Ever hear of a poker face?

Here’s a thought exercise: Try to communicate to your spouse where their keys are inside your house using only facial expressions…

Or try to order a pizza…not using words…but by using inflections and noise alone.

You can’t.

The fact is, words are what we use to label and categorize our world. They are the vehicle through which we codify our thoughts. Voice inflections, quirks, and physiological expressions only add information on top of the words that are used.

VERN's accuracy of 80%+ in internal and real-world applications is achieved on words ALONE. Just imagine how powerful your emotion recognition would be with VERN added…

…because it's not as accurate without it.

 

Financial topics

 

How much does VERN™ cost?
VERN can be accessed in several ways, so costs differ depending on the model you want to use. We've made low-cost options available for hobbyists, researchers, and mobile app developers, and we offer enterprise solutions with custom domain framing and advanced analytics.

Which option is best for me?

Need a secure, offline solution? We have models in the AI marketplaces ready to be deployed in your on-premises environment.

Need a custom frame? We can work with your organization to create frames of reference for more accurate analysis.

Want to try it out? We have an API-only solution that you can register for right here on this site.


How do you bill?

To use our API, simply register an account and put a card on file. We will charge you at the end of each month for that month's usage.

Marketplaces bill on a pre-purchased credit basis, and since we're a product in their ecosystem, they handle the billing.

For enterprise solutions, we charge the package cost at the end of the month. Custom framing development and implementation are billed separately.


Do you offer refunds?
Unfortunately, we do not offer refunds. Our process offers opportunities to try out the software risk-free. Once we provide the service, it can’t be returned.
 

Other questions

 

How is VERN™ better than sentiment analysis tools?
Sentiment analysis has its place. Unfortunately, to get truly actionable information you need to dig deeper. Simply knowing that an entire block of text is "positive," "negative," or "neutral" does nothing to help identify the root of the problem. A sentence-by-sentence analysis of the emotions present in each one can help identify where the problem lies. And what the hell does "mixed" even tell you? Nothing!

Is VERN™ biased?
With inherent biases affecting facial and voice analysis tools, it's a legitimate question to ask. The answer is simply: no. VERN™ provides a generalized interpretation of the emotions present, for all people. Adding personal and domain frames can influence the interpretation and make it more personal to a group (and therefore biased toward that frame). The general algorithm is agnostic.

Should I be afraid of the ‘terminator’?
No. Don't be silly. AI is still far away from that. VERN™ was engineered to act like a human "receiver," or recipient of communication. It is a big ear and processor that will help all sorts of people and computers understand our world. VERN™ only reacts to what users provide it. It can't read minds.

What do you see in the future?
We see a time in which autonomous and semi-sentient machines interact with humans on a daily basis. Soon, technologies powered by VERN™ will enable chatbots to stop being so robotic. It'll enable customer service representatives to understand the difference between a sarcastic joke and a real expression of anger. VERN™ will enable product marketers to test the effectiveness of their campaigns. Wherever a human and a computer meet, there's a need for VERN™.