What is the icc method
3/4/2026 09:52am
The “ICC method” usually refers to the **Intraclass Correlation Coefficient (ICC)**, a statistical method used to measure **reliability or agreement** when:
- You have **multiple measurements** of the same thing (e.g., repeated tests, multiple raters, multiple devices), and
- You want to know **how consistent** those measurements are.
---
## What ICC Measures
ICC answers questions like:
- *“Do different raters give similar scores?”*
- *“Are repeated measurements on the same subject stable?”*
It quantifies this on a scale that typically runs **0–1** (estimates can come out slightly negative in noisy samples, which is usually read as ~0):
- **< 0.5** → poor reliability
- **0.5–0.75** → moderate
- **0.75–0.9** → good
- **> 0.9** → excellent
(Thresholds vary by field, but this is a common rule of thumb.)
---
## How It Works (Conceptually)
ICC is based on **variance components**:
- Total variance = variance **between subjects** + variance **within subjects** (error / rater differences).
- ICC ≈ variance_between / (variance_between + variance_within)
So:
- If most variance is **between subjects** → high ICC (measurements agree well).
- If a lot of variance is **within subjects** (raters disagree, unstable over time) → low ICC.
This is usually estimated via an **ANOVA model** (one-way or two-way).
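As a concrete sketch of the variance-components idea, here is a minimal one-way ANOVA implementation of **ICC(1)** for single measurements, using numpy. The function name `icc1` and the array layout (subjects as rows, raters as columns) are illustrative choices, not a standard API:

```python
import numpy as np

def icc1(ratings):
    """ICC(1): one-way random-effects, single-measurement reliability.

    ratings: array of shape (n_subjects, k_raters).
    """
    n, k = ratings.shape
    subj_means = ratings.mean(axis=1)
    # Between-subjects mean square (signal: real differences between subjects)
    msb = k * ((subj_means - ratings.mean()) ** 2).sum() / (n - 1)
    # Within-subjects mean square (noise: disagreement / measurement error)
    msw = ((ratings - subj_means[:, None]) ** 2).sum() / (n * (k - 1))
    # High when between-subject variance dominates within-subject variance
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfect agreement: every rater gives each subject the same score -> ICC = 1
perfect = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
print(icc1(perfect))  # 1.0
```

When raters agree exactly, the within-subject mean square is zero and the formula reduces to 1, matching the conceptual ratio above.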
---
## Common ICC Types
You’ll often see notations like **ICC(1)**, **ICC(2)**, **ICC(3)**, sometimes with “single” vs “average” and “consistency” vs “absolute agreement”:
- **One-way random (ICC(1))** – subjects rated by different raters chosen at random; not all raters rate all subjects.
- **Two-way random (ICC(2))** – all subjects rated by all raters, raters considered random sample from a larger population.
- **Two-way mixed (ICC(3))** – all subjects rated by all raters, but raters are fixed (you care only about these raters).
Plus:
- **Single measures** – reliability of **one** rater/measurement.
- **Average measures** – reliability of the **mean of several** raters/measurements (usually higher ICC).
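The two-way forms can be sketched from the same ANOVA decomposition. The example below (function name `icc_two_way` is illustrative) computes single-measure **ICC(2,1)** and **ICC(3,1)** from the subject, rater, and residual mean squares, and shows how a constant rater offset is penalized under absolute agreement but not under consistency:

```python
import numpy as np

def icc_two_way(ratings):
    """Single-measure ICC(2,1) and ICC(3,1) via two-way ANOVA mean squares.

    ratings: array of shape (n_subjects, k_raters); all raters rate all subjects.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per subject
    col_means = ratings.mean(axis=0)   # per rater
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)  # subjects
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)  # raters
    resid = ratings - row_means[:, None] - col_means[None, :] + grand
    mse = (resid ** 2).sum() / ((n - 1) * (k - 1))
    # ICC(2,1): raters random; rater variance counts against agreement
    icc2 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
    # ICC(3,1): raters fixed; only consistency matters
    icc3 = (msr - mse) / (msr + (k - 1) * mse)
    return icc2, icc3

# Rater 2 always scores exactly 1 point higher than rater 1
offset = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 4.0], [4.0, 5.0]])
icc2, icc3 = icc_two_way(offset)
print(f"ICC(2,1) = {icc2:.2f}")  # < 1: the offset hurts absolute agreement
print(f"ICC(3,1) = {icc3:.2f}")  # 1.00: rankings are perfectly consistent
```

This illustrates the consistency vs. absolute-agreement distinction: ICC(3,1) ignores a systematic rater bias, while ICC(2,1) charges it against reliability.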
---
## Simple Example
Suppose 3 analysts each rate 20 companies on a risk score from 1–10. You compute ICC:
- If **ICC = 0.85** → their scores are **highly consistent**.
- If **ICC = 0.30** → their scores are **not very reliable**; who rates the company matters a lot.
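The analyst scenario can be simulated to see both regimes. Below, 20 companies get a latent "true" risk score; three analysts observe it with either small or large personal noise. The noise levels and seed are arbitrary assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def icc1(ratings):
    """One-way random-effects ICC, single measurement (n_subjects x k_raters)."""
    n, k = ratings.shape
    subj = ratings.mean(axis=1)
    msb = k * ((subj - ratings.mean()) ** 2).sum() / (n - 1)
    msw = ((ratings - subj[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

true_risk = rng.uniform(1, 10, size=20)  # 20 companies' latent risk scores
# Consistent analysts: small personal noise around the true score
consistent = true_risk[:, None] + rng.normal(0, 0.5, size=(20, 3))
# Noisy analysts: disagreement swamps the real differences between companies
noisy = true_risk[:, None] + rng.normal(0, 5.0, size=(20, 3))

print(f"consistent analysts: ICC = {icc1(consistent):.2f}")  # high
print(f"noisy analysts:      ICC = {icc1(noisy):.2f}")       # low
```

With small noise, nearly all variance is between companies, so the ICC lands near 1; with large noise, who rates the company matters a lot and the ICC drops sharply.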
---
To fine-tune the explanation:
Are you asking about ICC in **statistics/research (reliability)**, in **trading/finance**, or in some **other field**?