PawRead
Meet your dog's
emotional translator
Tell us a little about your dog and we'll personalize every analysis for them.
🐶
Biscuit
🐾
Free beta — enjoy unlimited analyses!
We'd love your feedback as we improve PawRead.
📷
Camera
captured
🐕
Point camera at your dog
LIVE
🎤
Tap to record sounds
JPG, PNG or WebP — iPhone users: convert HEIC to JPG first
🌿
Current situation
✏️
Additional notes
🔍
Analysis
🐾
Start your camera and tap Analyze
to read what your dog is feeling
Reading body language…
Consulting behavioral science research…
😊
Happy
AI confidence in this reading
What you can do
⚠️

About this analysis: PawRead uses Claude AI (Anthropic) to interpret visual cues — body posture, ear position, tail, facial expression — alongside the context you provided. It draws on published canine behavioral science and the Dog Facial Action Coding System (DogFACS). Accuracy is not guaranteed and varies by image quality, lighting, breed, and angle. This is not a substitute for professional veterinary or behavioral advice.

Add another dog
Got more than one dog? Add their profiles so you can switch between them.
Edit pet profile
Update your pet's details anytime.
PawRead FAQ

PawRead uses Claude AI (built by Anthropic) to analyze your pet's emotional state from a photo or video frame. Each analysis looks at visual cues including body posture, ear position, tail position, facial expression, eye shape, and overall muscle tension — combined with the context you provide about breed, age, and situation.

The AI is guided by published behavioral science research; its prompts are grounded in the methodologies below.

DogFACS — Dog Facial Action Coding System

Developed by researchers at the University of Portsmouth and University of Lincoln, DogFACS maps specific facial muscle movements in dogs to emotional states. It is the gold standard for scientific canine emotion research and has been used in peer-reviewed studies published in journals including Scientific Reports and Animal Cognition.

CatFACS — Cat Facial Action Coding System

CatFACS applies the same methodology to cats, developed by researchers at the University of Lincoln. It identifies facial action units in cats that correspond to different internal states, and has been used in pain assessment research (the Feline Grimace Scale) as well as emotional state studies.

Body language research

In addition to facial analysis, PawRead incorporates research on whole-body posture signals including tail position and movement, ear orientation, weight distribution, and coat condition — drawing on ethological research in canine and feline behavioral science.

How accurate is it?

Dogs: Current AI vision models achieve 60–75% accuracy on dog emotion classification in controlled research settings. Real-world accuracy varies based on image quality, lighting, angle, and breed — some breeds have more expressive faces than others.

Cats: Cats are naturally more subtle in their emotional expression — they evolved to conceal internal states. Expect somewhat lower confidence scores for cats: CatFACS research is less extensive than DogFACS, and accuracy is estimated at 50–65% in research settings.

⚠️ PawRead is not a substitute for professional veterinary or behavioral advice. If you are concerned about your pet's health or behavior, please consult a qualified veterinarian.

Privacy and your data

Photos you take or upload are sent to Anthropic's Claude API for analysis and are not stored on our servers. Your pet profiles and analysis history are stored locally on your device. Your email address is stored in Mailchimp for our newsletter — you can unsubscribe at any time.

We do not sell your data. We do not share your photos or personal information with third parties beyond what is necessary to provide the analysis service (Anthropic API).

Questions, feedback, or bug reports: [email protected]