mirror of https://github.com/esphome/esphome.git synced 2025-11-01 07:31:51 +00:00

Compare commits


1 commit

4172 changed files with 33033 additions and 64010 deletions

View File

@@ -9,7 +9,7 @@ This document provides essential context for AI models interacting with this pro
## 2. Core Technologies & Stack
* **Languages:** Python (>=3.11), C++ (gnu++20)
* **Languages:** Python (>=3.10), C++ (gnu++20)
* **Frameworks & Runtimes:** PlatformIO, Arduino, ESP-IDF.
* **Build Systems:** PlatformIO is the primary build system. CMake is used as an alternative.
* **Configuration:** YAML.
@@ -38,7 +38,7 @@ This document provides essential context for AI models interacting with this pro
5. **Dashboard** (`esphome/dashboard/`): A web-based interface for device configuration, management, and OTA updates.
* **Platform Support:**
1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (Original, C2, C3, C5, C6, H2, P4, S2, S3) with ESP-IDF framework. Arduino framework supports only a subset of the variants (Original, C3, S2, S3).
1. **ESP32** (`components/esp32/`): Espressif ESP32 family. Supports multiple variants (S2, S3, C3, etc.) and both IDF and Arduino frameworks.
2. **ESP8266** (`components/esp8266/`): Espressif ESP8266. Arduino framework only, with memory constraints.
3. **RP2040** (`components/rp2040/`): Raspberry Pi Pico/RP2040. Arduino framework with PIO (Programmable I/O) support.
4. **LibreTiny** (`components/libretiny/`): Realtek and Beken chips. Supports multiple chip families and auto-generated components.
@@ -60,7 +60,7 @@ This document provides essential context for AI models interacting with this pro
├── __init__.py # Component configuration schema and code generation
├── [component].h # C++ header file (if needed)
├── [component].cpp # C++ implementation (if needed)
└── [platform]/ # Platform-specific implementations
└── [platform]/ # Platform-specific implementations
├── __init__.py # Platform-specific configuration
├── [platform].h # Platform C++ header
└── [platform].cpp # Platform C++ implementation
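As a rough sketch of how the `__init__.py` in this layout typically wires the configuration schema to code generation (the `empty_component` namespace and `EmptyComponent` class are hypothetical, not part of this change):

```python
import esphome.codegen as cg
import esphome.config_validation as cv
from esphome.const import CONF_ID

# Hypothetical C++ namespace/class the generated code will instantiate
empty_ns = cg.esphome_ns.namespace("empty_component")
EmptyComponent = empty_ns.class_("EmptyComponent", cg.Component)

CONFIG_SCHEMA = cv.Schema(
    {
        cv.GenerateID(): cv.declare_id(EmptyComponent),
    }
).extend(cv.COMPONENT_SCHEMA)


async def to_code(config):
    # Create the C++ object and register it with the application loop
    var = cg.new_Pvariable(config[CONF_ID])
    await cg.register_component(var, config)
```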
@@ -150,8 +150,7 @@ This document provides essential context for AI models interacting with this pro
* **Configuration Validation:**
* **Common Validators:** `cv.int_`, `cv.float_`, `cv.string`, `cv.boolean`, `cv.int_range(min=0, max=100)`, `cv.positive_int`, `cv.percentage`.
* **Complex Validation:** `cv.All(cv.string, cv.Length(min=1, max=50))`, `cv.Any(cv.int_, cv.string)`.
* **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `esp32.only_on_variant(...)`, `cv.only_on_esp32`, `cv.only_on_esp8266`, `cv.only_on_rp2040`.
* **Framework-Specific:** `cv.only_with_framework(...)`, `cv.only_with_arduino`, `cv.only_with_esp_idf`.
* **Platform-Specific:** `cv.only_on(["esp32", "esp8266"])`, `cv.only_with_arduino`.
* **Schema Extensions:**
```python
CONFIG_SCHEMA = cv.Schema({ ... })
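# Illustrative continuation only (keys and defaults are hypothetical): combining the
# validators listed above with a component schema and a platform restriction.
CONFIG_SCHEMA = cv.All(
    cv.Schema(
        {
            cv.Required("name"): cv.All(cv.string, cv.Length(min=1, max=50)),
            cv.Optional("level", default=50): cv.int_range(min=0, max=100),
        }
    ).extend(cv.COMPONENT_SCHEMA),
    cv.only_on(["esp32", "esp8266"]),
)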
@@ -169,8 +168,6 @@ This document provides essential context for AI models interacting with this pro
* `platformio.ini`: Configures the PlatformIO build environments for different microcontrollers.
* `.pre-commit-config.yaml`: Configures the pre-commit hooks for linting and formatting.
* **CI/CD Pipeline:** Defined in `.github/workflows`.
* **Static Analysis & Development:**
* `esphome/core/defines.h`: A comprehensive header file containing all `#define` directives that can be added by components using `cg.add_define()` in Python. This file is used exclusively for development, static analysis tools, and CI testing - it is not used during runtime compilation. When developing components that add new defines, they must be added to this file to ensure proper IDE support and static analysis coverage. The file includes feature flags, build configurations, and platform-specific defines that help static analyzers understand the complete codebase without needing to compile for specific platforms.
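For example, a component's `to_code` can emit a feature flag (the define name here is hypothetical), which then needs a matching `#define USE_MY_FEATURE` entry in `esphome/core/defines.h` so IDEs and static analysis can see it:

```python
import esphome.codegen as cg


async def to_code(config):
    # Hypothetical flag; mirror it in esphome/core/defines.h for static analysis
    cg.add_define("USE_MY_FEATURE")
```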
## 6. Development & Testing Workflow
@@ -186,11 +183,6 @@ This document provides essential context for AI models interacting with this pro
└── components/[component]/ # Component-specific tests
```
Run them using `script/test_build_components`. Use `-c <component>` to test specific components and `-t <target>` for specific platforms.
* **Testing All Components Together:** To verify that all components can be tested together without ID conflicts or configuration issues, use:
```bash
./script/test_component_grouping.py -e config --all
```
This tests all components in a single build to catch conflicts that might not appear when testing components individually. Use `-e config` for fast configuration validation, or `-e compile` for full compilation testing.
* **Debugging and Troubleshooting:**
* **Debug Tools:**
- `esphome config <file>.yaml` to validate configuration.
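A couple of hedged command examples for the steps above (the component `mqtt` and target `esp32-idf` are illustrative placeholders):

```bash
# Validate a single YAML configuration
esphome config example.yaml

# Test one component's configs, optionally restricted to a single platform target
script/test_build_components -e config -c mqtt -t esp32-idf
```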

View File

@@ -1 +1 @@
049d60eed541730efaa4c0dc5d337b4287bf29b6daa350b5dfc1f23915f1c52f
0c2acbc16bfb7d63571dbe7042f94f683be25e4ca8a0f158a960a94adac4b931

View File

@@ -21,6 +21,10 @@ body:
Provide a clear and concise description of what the problem is.
⚠️ **WARNING: Do NOT paste logs, stack traces, or error messages here!**
Use the "Logs" section below instead. Issues with logs
in this field will be automatically closed.
- type: markdown
attributes:
value: |
@@ -79,7 +83,7 @@ body:
- type: textarea
id: logs
attributes:
label: Anything in the logs that might be useful for us?
label: Logs
description: For example, error messages or stack traces. Serial or USB logs are much more useful than WiFi logs.
render: txt
- type: textarea

View File

@@ -47,7 +47,7 @@ runs:
- name: Build and push to ghcr by digest
id: build-ghcr
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@v6.18.0
env:
DOCKER_BUILD_SUMMARY: false
DOCKER_BUILD_RECORD_UPLOAD: false
@@ -73,7 +73,7 @@ runs:
- name: Build and push to dockerhub by digest
id: build-dockerhub
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@v6.18.0
env:
DOCKER_BUILD_SUMMARY: false
DOCKER_BUILD_RECORD_UPLOAD: false

View File

@@ -17,12 +17,12 @@ runs:
steps:
- name: Set up Python ${{ inputs.python-version }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: ${{ inputs.python-version }}
- name: Restore Python virtual environment
id: cache-venv
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache/restore@v4.2.3
with:
path: venv
# yamllint disable-line rule:line-length

View File

@@ -0,0 +1,248 @@
name: Auto-close issues with logs in problem field
on:
issues:
types: [opened]
issue_comment:
types: [created]
workflow_dispatch:
inputs:
issue_number:
description: 'Issue number to check for logs'
required: true
type: number
jobs:
check-logs-in-problem:
runs-on: ubuntu-latest
if: github.event.issue.state == 'open' || (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@esphomebot reopen')) || github.event_name == 'workflow_dispatch'
steps:
- name: Check for logs and handle issue state
uses: actions/github-script@v7.0.1
with:
script: |
// Handle different trigger types
let issue, isReassessment;
if (context.eventName === 'workflow_dispatch') {
// Manual dispatch - get issue from input
const issueNumber = ${{ github.event.inputs.issue_number }};
console.log('Manual dispatch for issue:', issueNumber);
const issueResponse = await github.rest.issues.get({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: parseInt(issueNumber)
});
issue = issueResponse.data;
isReassessment = false; // Treat manual dispatch as initial check
} else {
// Normal event-driven flow
issue = context.payload.issue;
isReassessment = context.eventName === 'issue_comment' && context.payload.comment.body.includes('@esphomebot reopen');
}
console.log('Event type:', context.eventName);
console.log('Is reassessment:', isReassessment);
console.log('Issue state:', issue.state);
// Extract the problem section from the issue body
const body = issue.body || '';
// Look for the problem section between "### The problem" and the next section
const problemMatch = body.match(/### The problem\s*\n([\s\S]*?)(?=\n### |$)/i);
if (!problemMatch) {
console.log('Could not find problem section');
if (isReassessment) {
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
body: '❌ Could not find the "The problem" section in the issue template. Please make sure you are using the proper issue template format.'
});
}
return;
}
const problemText = problemMatch[1].trim();
console.log('Problem text length:', problemText.length);
// Function to check if text contains logs
function checkForLogs(text) {
// Patterns that indicate logs/stack traces/error messages
const logPatterns = [
// ESPHome specific log patterns with brackets
/^\[[DIWEVC]\]\[[^\]]+(?::\d+)?\]:/m, // [D][component:123]: message
/^\[\d{2}:\d{2}:\d{2}\]\[[DIWEVC]\]\[[^\]]+(?::\d+)?\]:/m, // [12:34:56][D][component:123]: message
/^\[\d{2}:\d{2}:\d{2}\.\d{3}\]\[[DIWEVC]\]\[[^\]]+(?::\d+)?\]:/m, // [12:34:56.123][D][component:123]: message
// Common log prefixes
/^\[[\d\s\-:\.]+\]/m, // [timestamp] format
/^\d{4}-\d{2}-\d{2}\s+\d{2}:\d{2}:\d{2}/m, // YYYY-MM-DD HH:MM:SS
/^\w+\s+\d{2}:\d{2}:\d{2}/m, // INFO 12:34:56
// Error indicators
/^(ERROR|WARN|WARNING|FATAL|DEBUG|INFO|TRACE)[\s:]/mi,
/^(Exception|Error|Traceback|Stack trace)/mi,
/at\s+[\w\.]+\([^)]*:\d+:\d+\)/m, // Stack trace format
/^\s*File\s+"[^"]*",\s+line\s+\d+/m, // Python traceback
// Legacy ESPHome log patterns
/^\[\d{2}:\d{2}:\d{2}\]\[/m, // [12:34:56][component]
/^WARNING\s+[^:\s]+:/m, // WARNING component:
/^ERROR\s+[^:\s]+:/m, // ERROR component:
// Multiple consecutive lines starting with similar patterns
/(^(INFO|DEBUG|WARN|ERROR)[^\n]*\n){3,}/mi,
/(^\[[DIWEVC]\]\[[^\]]+\][^\n]*\n){3,}/mi, // Multiple ESPHome log lines
// Hex dumps or binary data
/0x[0-9a-f]{4,}/i,
/[0-9a-f]{8,}/,
// Compilation errors
/error:\s+/i,
/:\d+:\d+:\s+(error|warning):/i,
// Very long lines (often log output)
/.{200,}/
];
const hasLogs = logPatterns.some(pattern => {
const matches = pattern.test(text);
if (matches) {
console.log('Pattern matched:', pattern.toString());
}
return matches;
});
// Additional heuristics
const lineCount = text.split('\n').length;
const hasLotsOfLines = lineCount > 20; // More than 20 lines might be logs
const hasCodeBlocks = (text.match(/```/g) || []).length >= 2;
const longCodeBlock = hasCodeBlocks && text.length > 1000;
console.log(`Lines: ${lineCount}, Has logs: ${hasLogs}, Long code block: ${longCodeBlock}`);
return hasLogs || (hasLotsOfLines && longCodeBlock);
}
const hasLogsInProblem = checkForLogs(problemText);
// Handle reassessment (when @esphomebot is mentioned)
if (isReassessment) {
if (!hasLogsInProblem) {
// No logs found, check if issue was auto-closed and reopen it
if (issue.state === 'closed') {
// Check if it has the auto-closed label
const labels = issue.labels.map(label => label.name);
if (labels.includes('auto-closed')) {
console.log('Reopening issue - logs have been moved');
await github.rest.issues.update({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
state: 'open'
});
// Remove auto-closed and invalid labels
await github.rest.issues.removeLabel({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
name: 'auto-closed'
});
await github.rest.issues.removeLabel({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
name: 'invalid'
});
// Find and edit the original auto-close comment
const comments = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number
});
const autoCloseComment = comments.data.find(comment =>
comment.user.login === 'github-actions[bot]' &&
comment.body.includes('automatically closed because it appears to contain logs')
);
if (autoCloseComment) {
const updatedComment = `✅ **ISSUE REOPENED**
Thank you for helping us maintain organized issue reports! 🙏`;
await github.rest.issues.updateComment({
owner: context.repo.owner,
repo: context.repo.repo,
comment_id: autoCloseComment.id,
body: updatedComment
});
}
}
}
} else {
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
body: '❌ Logs are still detected in the "The problem" section. Please move them to the "Logs" section and try again.'
});
}
return;
}
// Handle initial issue opening
if (!hasLogsInProblem) {
console.log('No logs detected in problem field');
return;
}
console.log('Logs detected in problem field, closing issue');
// Close the issue
await github.rest.issues.update({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
state: 'closed'
});
// Add a comment explaining why it was closed
const comment = `This issue has been automatically closed because it appears to contain logs, stack traces, or error messages in the "The problem" field.
⚠️ **Please follow the issue template correctly:**
- Use the "The problem" field to **describe** your issue in plain English
- Put logs, error messages, and stack traces in the "Logs" section instead
To reopen this issue:
1. Edit your original issue description
2. Move any logs/error messages to the appropriate "Logs" section
3. Rewrite the "The problem" section with a clear description of what you were trying to do and what went wrong
4. Comment exactly \`@esphomebot reopen\` to reassess and automatically reopen if fixed
Thank you for helping us maintain organized issue reports! 🙏`;
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
body: comment
});
// Add labels
await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
labels: ['invalid', 'auto-closed']
});

View File

@@ -11,10 +11,51 @@ permissions:
contents: read
env:
TARGET_PLATFORMS: |
esp32
esp8266
rp2040
libretiny
bk72xx
rtl87xx
ln882x
nrf52
host
PLATFORM_COMPONENTS: |
alarm_control_panel
audio_adc
audio_dac
binary_sensor
button
canbus
climate
cover
datetime
display
event
fan
light
lock
media_player
microphone
number
one_wire
ota
output
packet_transport
select
sensor
speaker
stepper
switch
text
text_sensor
time
touchscreen
update
valve
SMALL_PR_THRESHOLD: 30
MAX_LABELS: 15
TOO_BIG_THRESHOLD: 1000
COMPONENT_LABEL_THRESHOLD: 10
jobs:
label:
@@ -22,521 +63,107 @@ jobs:
if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Get changes
id: changes
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
# Get PR number
pr_number="${{ github.event.pull_request.number }}"
# Get list of changed files using gh CLI
files=$(gh pr diff $pr_number --name-only)
echo "files<<EOF" >> $GITHUB_OUTPUT
echo "$files" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT
# Get file stats (additions + deletions) using gh CLI
stats=$(gh pr view $pr_number --json files --jq '.files | map(.additions + .deletions) | add')
echo "total_changes=${stats:-0}" >> $GITHUB_OUTPUT
- name: Generate a token
id: generate-token
uses: actions/create-github-app-token@67018539274d69449ef7c02e8e71183d1719ab42 # v2
uses: actions/create-github-app-token@v2
with:
app-id: ${{ secrets.ESPHOME_GITHUB_APP_ID }}
private-key: ${{ secrets.ESPHOME_GITHUB_APP_PRIVATE_KEY }}
- name: Auto Label PR
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
github-token: ${{ steps.generate-token.outputs.token }}
script: |
const fs = require('fs');
// Constants
const SMALL_PR_THRESHOLD = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
const MAX_LABELS = parseInt('${{ env.MAX_LABELS }}');
const TOO_BIG_THRESHOLD = parseInt('${{ env.TOO_BIG_THRESHOLD }}');
const COMPONENT_LABEL_THRESHOLD = parseInt('${{ env.COMPONENT_LABEL_THRESHOLD }}');
const BOT_COMMENT_MARKER = '<!-- auto-label-pr-bot -->';
const CODEOWNERS_MARKER = '<!-- codeowners-request -->';
const TOO_BIG_MARKER = '<!-- too-big-request -->';
const MANAGED_LABELS = [
'new-component',
'new-platform',
'new-target-platform',
'merging-to-release',
'merging-to-beta',
'core',
'small-pr',
'dashboard',
'github-actions',
'by-code-owner',
'has-tests',
'needs-tests',
'needs-docs',
'needs-codeowners',
'too-big',
'labeller-recheck',
'bugfix',
'new-feature',
'breaking-change',
'code-quality'
];
const DOCS_PR_PATTERNS = [
/https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
/esphome\/esphome-docs#\d+/
];
// Global state
const { owner, repo } = context.repo;
const pr_number = context.issue.number;
// Get current labels and PR data
// Get current labels
const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
owner,
repo,
issue_number: pr_number
});
const currentLabels = currentLabelsData.map(label => label.name);
// Define managed labels that this workflow controls
const managedLabels = currentLabels.filter(label =>
label.startsWith('component: ') || MANAGED_LABELS.includes(label)
label.startsWith('component: ') ||
[
'new-component',
'new-platform',
'new-target-platform',
'merging-to-release',
'merging-to-beta',
'core',
'small-pr',
'dashboard',
'github-actions',
'by-code-owner',
'has-tests',
'needs-tests',
'needs-docs',
'too-big',
'labeller-recheck'
].includes(label)
);
// Check for mega-PR early - if present, skip most automatic labeling
const isMegaPR = currentLabels.includes('mega-pr');
// Get all PR files with automatic pagination
const prFiles = await github.paginate(
github.rest.pulls.listFiles,
{
owner,
repo,
pull_number: pr_number
}
);
// Calculate data from PR files
const changedFiles = prFiles.map(file => file.filename);
const totalAdditions = prFiles.reduce((sum, file) => sum + (file.additions || 0), 0);
const totalDeletions = prFiles.reduce((sum, file) => sum + (file.deletions || 0), 0);
const totalChanges = totalAdditions + totalDeletions;
console.log('Current labels:', currentLabels.join(', '));
console.log('Managed labels:', managedLabels.join(', '));
// Get changed files
const changedFiles = `${{ steps.changes.outputs.files }}`.split('\n').filter(f => f.length > 0);
const totalChanges = parseInt('${{ steps.changes.outputs.total_changes }}') || 0;
console.log('Changed files:', changedFiles.length);
console.log('Total changes:', totalChanges);
if (isMegaPR) {
console.log('Mega-PR detected - applying limited labeling logic');
}
// Fetch API data
async function fetchApiData() {
try {
const response = await fetch('https://data.esphome.io/components.json');
const componentsData = await response.json();
return {
targetPlatforms: componentsData.target_platforms || [],
platformComponents: componentsData.platform_components || []
};
} catch (error) {
console.log('Failed to fetch components data from API:', error.message);
return { targetPlatforms: [], platformComponents: [] };
}
}
const labels = new Set();
// Strategy: Merge branch detection
async function detectMergeBranch() {
const labels = new Set();
const baseRef = context.payload.pull_request.base.ref;
// Get environment variables
const targetPlatforms = `${{ env.TARGET_PLATFORMS }}`.split('\n').filter(p => p.trim().length > 0).map(p => p.trim());
const platformComponents = `${{ env.PLATFORM_COMPONENTS }}`.split('\n').filter(p => p.trim().length > 0).map(p => p.trim());
const smallPrThreshold = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
const maxLabels = parseInt('${{ env.MAX_LABELS }}');
// Strategy: Merge to release or beta branch
const baseRef = context.payload.pull_request.base.ref;
if (baseRef !== 'dev') {
if (baseRef === 'release') {
labels.add('merging-to-release');
} else if (baseRef === 'beta') {
labels.add('merging-to-beta');
}
return labels;
}
// Strategy: Component and platform labeling
async function detectComponentPlatforms(apiData) {
const labels = new Set();
const componentRegex = /^esphome\/components\/([^\/]+)\//;
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);
for (const file of changedFiles) {
const componentMatch = file.match(componentRegex);
if (componentMatch) {
labels.add(`component: ${componentMatch[1]}`);
}
const platformMatch = file.match(targetPlatformRegex);
if (platformMatch) {
labels.add(`platform: ${platformMatch[1]}`);
}
}
return labels;
}
// Strategy: New component detection
async function detectNewComponents() {
const labels = new Set();
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
for (const file of addedFiles) {
const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
if (componentMatch) {
try {
const content = fs.readFileSync(file, 'utf8');
if (content.includes('IS_TARGET_PLATFORM = True')) {
labels.add('new-target-platform');
}
} catch (error) {
console.log(`Failed to read content of ${file}:`, error.message);
}
labels.add('new-component');
}
}
return labels;
}
// Strategy: New platform detection
async function detectNewPlatforms(apiData) {
const labels = new Set();
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
for (const file of addedFiles) {
const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
if (platformFileMatch) {
const [, component, platform] = platformFileMatch;
if (apiData.platformComponents.includes(platform)) {
labels.add('new-platform');
}
}
const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
if (platformDirMatch) {
const [, component, platform] = platformDirMatch;
if (apiData.platformComponents.includes(platform)) {
labels.add('new-platform');
}
}
}
return labels;
}
// Strategy: Core files detection
async function detectCoreChanges() {
const labels = new Set();
const coreFiles = changedFiles.filter(file =>
file.startsWith('esphome/core/') ||
(file.startsWith('esphome/') && file.split('/').length === 2)
);
if (coreFiles.length > 0) {
labels.add('core');
}
return labels;
}
// Strategy: PR size detection
async function detectPRSize() {
const labels = new Set();
if (totalChanges <= SMALL_PR_THRESHOLD) {
labels.add('small-pr');
return labels;
}
const testAdditions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.additions || 0), 0);
const testDeletions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.deletions || 0), 0);
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
// Don't add too-big if mega-pr label is already present
if (nonTestChanges > TOO_BIG_THRESHOLD && !isMegaPR) {
labels.add('too-big');
}
return labels;
}
// Strategy: Dashboard changes
async function detectDashboardChanges() {
const labels = new Set();
const dashboardFiles = changedFiles.filter(file =>
file.startsWith('esphome/dashboard/') ||
file.startsWith('esphome/components/dashboard_import/')
);
if (dashboardFiles.length > 0) {
labels.add('dashboard');
}
return labels;
}
// Strategy: GitHub Actions changes
async function detectGitHubActionsChanges() {
const labels = new Set();
const githubActionsFiles = changedFiles.filter(file =>
file.startsWith('.github/workflows/')
);
if (githubActionsFiles.length > 0) {
labels.add('github-actions');
}
return labels;
}
// Strategy: Code owner detection
async function detectCodeOwner() {
const labels = new Set();
try {
const { data: codeownersFile } = await github.rest.repos.getContent({
owner,
repo,
path: 'CODEOWNERS',
});
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
const prAuthor = context.payload.pull_request.user.login;
const codeownersLines = codeownersContent.split('\n')
.map(line => line.trim())
.filter(line => line && !line.startsWith('#'));
const codeownersRegexes = codeownersLines.map(line => {
const parts = line.split(/\s+/);
const pattern = parts[0];
const owners = parts.slice(1);
let regex;
if (pattern.endsWith('*')) {
const dir = pattern.slice(0, -1);
regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
} else if (pattern.includes('*')) {
// First escape all regex special chars except *, then replace * with .*
const regexPattern = pattern
.replace(/[.+?^${}()|[\]\\]/g, '\\$&')
.replace(/\*/g, '.*');
regex = new RegExp(`^${regexPattern}$`);
} else {
regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
}
return { regex, owners };
});
for (const file of changedFiles) {
for (const { regex, owners } of codeownersRegexes) {
if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
labels.add('by-code-owner');
return labels;
}
}
}
} catch (error) {
console.log('Failed to read or parse CODEOWNERS file:', error.message);
}
return labels;
}
// Strategy: Test detection
async function detectTests() {
const labels = new Set();
const testFiles = changedFiles.filter(file => file.startsWith('tests/'));
if (testFiles.length > 0) {
labels.add('has-tests');
}
return labels;
}
// Strategy: PR Template Checkbox detection
async function detectPRTemplateCheckboxes() {
const labels = new Set();
const prBody = context.payload.pull_request.body || '';
console.log('Checking PR template checkboxes...');
// Check for checked checkboxes in the "Types of changes" section
const checkboxPatterns = [
{ pattern: /- \[x\] Bugfix \(non-breaking change which fixes an issue\)/i, label: 'bugfix' },
{ pattern: /- \[x\] New feature \(non-breaking change which adds functionality\)/i, label: 'new-feature' },
{ pattern: /- \[x\] Breaking change \(fix or feature that would cause existing functionality to not work as expected\)/i, label: 'breaking-change' },
{ pattern: /- \[x\] Code quality improvements to existing code or addition of tests/i, label: 'code-quality' }
];
for (const { pattern, label } of checkboxPatterns) {
if (pattern.test(prBody)) {
console.log(`Found checked checkbox for: ${label}`);
labels.add(label);
}
}
return labels;
}
// Strategy: Requirements detection
async function detectRequirements(allLabels) {
const labels = new Set();
// Check for missing tests
if ((allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) && !allLabels.has('has-tests')) {
labels.add('needs-tests');
}
// Check for missing docs
if (allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) {
const prBody = context.payload.pull_request.body || '';
const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));
if (!hasDocsLink) {
labels.add('needs-docs');
}
}
// Check for missing CODEOWNERS
if (allLabels.has('new-component')) {
const codeownersModified = prFiles.some(file =>
file.filename === 'CODEOWNERS' &&
(file.status === 'modified' || file.status === 'added') &&
(file.additions || 0) > 0
);
if (!codeownersModified) {
labels.add('needs-codeowners');
}
}
return labels;
}
// Generate review messages
function generateReviewMessages(finalLabels) {
const messages = [];
const prAuthor = context.payload.pull_request.user.login;
// Too big message
if (finalLabels.includes('too-big')) {
const testAdditions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.additions || 0), 0);
const testDeletions = prFiles
.filter(file => file.filename.startsWith('tests/'))
.reduce((sum, file) => sum + (file.deletions || 0), 0);
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
const tooManyLabels = finalLabels.length > MAX_LABELS;
const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;
let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;
if (tooManyLabels && tooManyChanges) {
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${finalLabels.length} different components/areas.`;
} else if (tooManyLabels) {
message += `This PR affects ${finalLabels.length} different components/areas.`;
} else {
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
}
message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;
messages.push(message);
}
// CODEOWNERS message
if (finalLabels.includes('needs-codeowners')) {
const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
`Hey there @${prAuthor},\n` +
`Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
`This way we can notify you if a bug report for this integration is reported.\n\n` +
`In \`__init__.py\` of the integration, please add:\n\n` +
`\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
`And run \`script/build_codeowners.py\``;
messages.push(message);
}
return messages;
}
// Handle reviews
async function handleReviews(finalLabels) {
const reviewMessages = generateReviewMessages(finalLabels);
const hasReviewableLabels = finalLabels.some(label =>
['too-big', 'needs-codeowners'].includes(label)
);
const { data: reviews } = await github.rest.pulls.listReviews({
owner,
repo,
pull_number: pr_number
});
const botReviews = reviews.filter(review =>
review.user.type === 'Bot' &&
review.state === 'CHANGES_REQUESTED' &&
review.body && review.body.includes(BOT_COMMENT_MARKER)
);
if (hasReviewableLabels) {
const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;
if (botReviews.length > 0) {
// Update existing review
await github.rest.pulls.updateReview({
owner,
repo,
pull_number: pr_number,
review_id: botReviews[0].id,
body: reviewBody
});
console.log('Updated existing bot review');
} else {
// Create new review
await github.rest.pulls.createReview({
owner,
repo,
pull_number: pr_number,
body: reviewBody,
event: 'REQUEST_CHANGES'
});
console.log('Created new bot review');
}
} else if (botReviews.length > 0) {
// Dismiss existing reviews
for (const review of botReviews) {
try {
await github.rest.pulls.dismissReview({
owner,
repo,
pull_number: pr_number,
review_id: review.id,
message: 'Review dismissed: All requirements have been met'
});
console.log(`Dismissed bot review ${review.id}`);
} catch (error) {
console.log(`Failed to dismiss review ${review.id}:`, error.message);
}
}
}
}
// Main execution
const apiData = await fetchApiData();
const baseRef = context.payload.pull_request.base.ref;
// Early exit for non-dev branches
if (baseRef !== 'dev') {
const branchLabels = await detectMergeBranch();
const finalLabels = Array.from(branchLabels);
// When targeting non-dev branches, only use merge warning labels
const finalLabels = Array.from(labels);
console.log('Computed labels (merge branch only):', finalLabels.join(', '));
// Apply labels
// Add new labels
if (finalLabels.length > 0) {
console.log(`Adding labels: ${finalLabels.join(', ')}`);
await github.rest.issues.addLabels({
owner,
repo,
@@ -545,9 +172,13 @@ jobs:
});
}
// Remove old managed labels
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
// Remove old managed labels that are no longer needed
const labelsToRemove = managedLabels.filter(label =>
!finalLabels.includes(label)
);
for (const label of labelsToRemove) {
console.log(`Removing label: ${label}`);
try {
await github.rest.issues.removeLabel({
owner,
@@ -560,81 +191,234 @@ jobs:
}
}
return;
return; // Exit early, don't process other strategies
}
// Run all strategies
const [
branchLabels,
componentLabels,
newComponentLabels,
newPlatformLabels,
coreLabels,
sizeLabels,
dashboardLabels,
actionsLabels,
codeOwnerLabels,
testLabels,
checkboxLabels
] = await Promise.all([
detectMergeBranch(),
detectComponentPlatforms(apiData),
detectNewComponents(),
detectNewPlatforms(apiData),
detectCoreChanges(),
detectPRSize(),
detectDashboardChanges(),
detectGitHubActionsChanges(),
detectCodeOwner(),
detectTests(),
detectPRTemplateCheckboxes()
]);
// Strategy: Component and Platform labeling
const componentRegex = /^esphome\/components\/([^\/]+)\//;
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${targetPlatforms.join('|')})/`);
// Combine all labels
const allLabels = new Set([
...branchLabels,
...componentLabels,
...newComponentLabels,
...newPlatformLabels,
...coreLabels,
...sizeLabels,
...dashboardLabels,
...actionsLabels,
...codeOwnerLabels,
...testLabels,
...checkboxLabels
]);
for (const file of changedFiles) {
// Check for component changes
const componentMatch = file.match(componentRegex);
if (componentMatch) {
const component = componentMatch[1];
labels.add(`component: ${component}`);
}
// Detect requirements based on all other labels
const requirementLabels = await detectRequirements(allLabels);
for (const label of requirementLabels) {
allLabels.add(label);
}
let finalLabels = Array.from(allLabels);
// For mega-PRs, exclude component labels if there are too many
if (isMegaPR) {
const componentLabels = finalLabels.filter(label => label.startsWith('component: '));
if (componentLabels.length > COMPONENT_LABEL_THRESHOLD) {
finalLabels = finalLabels.filter(label => !label.startsWith('component: '));
console.log(`Mega-PR detected - excluding ${componentLabels.length} component labels (threshold: ${COMPONENT_LABEL_THRESHOLD})`);
// Check for target platform changes
const platformMatch = file.match(targetPlatformRegex);
if (platformMatch) {
const targetPlatform = platformMatch[1];
labels.add(`platform: ${targetPlatform}`);
}
}
// Handle too many labels (only for non-mega PRs)
const tooManyLabels = finalLabels.length > MAX_LABELS;
// Get PR files for new component/platform detection
const { data: prFiles } = await github.rest.pulls.listFiles({
owner,
repo,
pull_number: pr_number
});
if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
finalLabels = ['too-big'];
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
// Strategy: New Component detection
for (const file of addedFiles) {
// Check for new component files: esphome/components/{component}/__init__.py
const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
if (componentMatch) {
try {
// Read the content directly from the filesystem since we have it checked out
const content = fs.readFileSync(file, 'utf8');
// Strategy: New Target Platform detection
if (content.includes('IS_TARGET_PLATFORM = True')) {
labels.add('new-target-platform');
}
labels.add('new-component');
} catch (error) {
console.log(`Failed to read content of ${file}:`, error.message);
// Fallback: assume it's a new component if we can't read the content
labels.add('new-component');
}
}
}
// Strategy: New Platform detection
for (const file of addedFiles) {
// Check for new platform files: esphome/components/{component}/{platform}.py
const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
if (platformFileMatch) {
const [, component, platform] = platformFileMatch;
if (platformComponents.includes(platform)) {
labels.add('new-platform');
}
}
// Check for new platform files: esphome/components/{component}/{platform}/__init__.py
const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
if (platformDirMatch) {
const [, component, platform] = platformDirMatch;
if (platformComponents.includes(platform)) {
labels.add('new-platform');
}
}
}
const coreFiles = changedFiles.filter(file =>
file.startsWith('esphome/core/') ||
(file.startsWith('esphome/') && file.split('/').length === 2)
);
if (coreFiles.length > 0) {
labels.add('core');
}
// Strategy: Small PR detection
if (totalChanges <= smallPrThreshold) {
labels.add('small-pr');
}
// Strategy: Dashboard changes
const dashboardFiles = changedFiles.filter(file =>
file.startsWith('esphome/dashboard/') ||
file.startsWith('esphome/components/dashboard_import/')
);
if (dashboardFiles.length > 0) {
labels.add('dashboard');
}
// Strategy: GitHub Actions changes
const githubActionsFiles = changedFiles.filter(file =>
file.startsWith('.github/workflows/')
);
if (githubActionsFiles.length > 0) {
labels.add('github-actions');
}
// Strategy: Code Owner detection
try {
// Fetch CODEOWNERS file from the repository (in case it was changed in this PR)
const { data: codeownersFile } = await github.rest.repos.getContent({
owner,
repo,
path: 'CODEOWNERS',
});
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
const prAuthor = context.payload.pull_request.user.login;
// Parse CODEOWNERS file
const codeownersLines = codeownersContent.split('\n')
.map(line => line.trim())
.filter(line => line && !line.startsWith('#'));
let isCodeOwner = false;
// Precompile CODEOWNERS patterns into regex objects
const codeownersRegexes = codeownersLines.map(line => {
const parts = line.split(/\s+/);
const pattern = parts[0];
const owners = parts.slice(1);
let regex;
if (pattern.endsWith('*')) {
// Directory pattern like "esphome/components/api/*"
const dir = pattern.slice(0, -1);
regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
} else if (pattern.includes('*')) {
// Glob pattern
const regexPattern = pattern
.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
.replace(/\\*/g, '.*');
regex = new RegExp(`^${regexPattern}$`);
} else {
// Exact match
regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
}
return { regex, owners };
});
for (const file of changedFiles) {
for (const { regex, owners } of codeownersRegexes) {
if (regex.test(file)) {
// Check if PR author is in the owners list
if (owners.some(owner => owner === `@${prAuthor}`)) {
isCodeOwner = true;
break;
}
}
}
if (isCodeOwner) break;
}
if (isCodeOwner) {
labels.add('by-code-owner');
}
} catch (error) {
console.log('Failed to read or parse CODEOWNERS file:', error.message);
}
// Strategy: Test detection
const testFiles = changedFiles.filter(file =>
file.startsWith('tests/')
);
if (testFiles.length > 0) {
labels.add('has-tests');
} else {
// Only check for needs-tests if this is a new component or new platform
if (labels.has('new-component') || labels.has('new-platform')) {
labels.add('needs-tests');
}
}
// Strategy: Documentation check for new components/platforms
if (labels.has('new-component') || labels.has('new-platform')) {
const prBody = context.payload.pull_request.body || '';
// Look for documentation PR links
// Patterns to match:
// - https://github.com/esphome/esphome-docs/pull/1234
// - esphome/esphome-docs#1234
const docsPrPatterns = [
/https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
/esphome\/esphome-docs#\d+/
];
const hasDocsLink = docsPrPatterns.some(pattern => pattern.test(prBody));
if (!hasDocsLink) {
labels.add('needs-docs');
}
}
// Convert Set to Array
let finalLabels = Array.from(labels);
console.log('Computed labels:', finalLabels.join(', '));
// Handle reviews
await handleReviews(finalLabels);
// Don't set more than max labels
if (finalLabels.length > maxLabels) {
const originalLength = finalLabels.length;
console.log(`Not setting ${originalLength} labels because out of range`);
finalLabels = ['too-big'];
// Apply labels
// Request changes on the PR
await github.rest.pulls.createReview({
owner,
repo,
pull_number: pr_number,
body: `This PR is too large and affects ${originalLength} different components/areas. Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.`,
event: 'REQUEST_CHANGES'
});
}
// Add new labels
if (finalLabels.length > 0) {
console.log(`Adding labels: ${finalLabels.join(', ')}`);
await github.rest.issues.addLabels({
@@ -645,8 +429,11 @@ jobs:
});
}
// Remove old managed labels
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
// Remove old managed labels that are no longer needed
const labelsToRemove = managedLabels.filter(label =>
!finalLabels.includes(label)
);
for (const label of labelsToRemove) {
console.log(`Removing label: ${label}`);
try {

View File

@@ -21,9 +21,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: "3.11"
@@ -47,7 +47,7 @@ jobs:
fi
- if: failure()
name: Review PR
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
await github.rest.pulls.createReview({
@@ -62,7 +62,7 @@ jobs:
run: git diff
- if: failure()
name: Archive artifacts
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: generated-proto-files
path: |
@@ -70,7 +70,7 @@ jobs:
esphome/components/api/api_pb2_service.*
- if: success()
name: Dismiss review
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
let reviews = await github.rest.pulls.listReviews({

View File

@@ -6,7 +6,6 @@ on:
- ".clang-tidy"
- "platformio.ini"
- "requirements_dev.txt"
- "sdkconfig.defaults"
- ".clang-tidy.hash"
- "script/clang_tidy_hash.py"
- ".github/workflows/ci-clang-tidy-hash.yml"
@@ -21,10 +20,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: "3.11"
@@ -42,7 +41,7 @@ jobs:
- if: failure()
name: Request changes
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
await github.rest.pulls.createReview({
@@ -55,7 +54,7 @@ jobs:
- if: success()
name: Dismiss review
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
let reviews = await github.rest.pulls.listReviews({

View File

@@ -43,13 +43,13 @@ jobs:
- "docker"
# - "lint"
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/checkout@v4.2.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: "3.11"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@v3.11.1
- name: Set TAG
run: |

View File

@@ -36,18 +36,18 @@ jobs:
cache-key: ${{ steps.cache-key.outputs.key }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Generate cache-key
id: cache-key
run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt', '.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: ${{ env.DEFAULT_PYTHON }}
- name: Restore Python virtual environment
id: cache-venv
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache@v4.2.3
with:
path: venv
# yamllint disable-line rule:line-length
@@ -70,7 +70,7 @@ jobs:
if: needs.determine-jobs.outputs.python-linters == 'true'
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Restore Python
uses: ./.github/actions/restore-python
with:
@@ -91,7 +91,7 @@ jobs:
- common
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Restore Python
uses: ./.github/actions/restore-python
with:
@@ -105,7 +105,6 @@ jobs:
script/ci-custom.py
script/build_codeowners.py --check
script/build_language_schema.py --check
script/generate-esp32-boards.py --check
pytest:
name: Run pytest
@@ -137,7 +136,7 @@ jobs:
- common
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Restore Python
id: restore-python
uses: ./.github/actions/restore-python
@@ -157,12 +156,12 @@ jobs:
. venv/bin/activate
pytest -vv --cov-report=xml --tb=native -n auto tests --ignore=tests/integration/
- name: Upload coverage to Codecov
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@v5.4.3
with:
token: ${{ secrets.CODECOV_TOKEN }}
- name: Save Python virtual environment cache
if: github.ref == 'refs/heads/dev'
uses: actions/cache/save@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache/save@v4.2.3
with:
path: venv
key: ${{ runner.os }}-${{ steps.restore-python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
@@ -177,11 +176,10 @@ jobs:
clang-tidy: ${{ steps.determine.outputs.clang-tidy }}
python-linters: ${{ steps.determine.outputs.python-linters }}
changed-components: ${{ steps.determine.outputs.changed-components }}
changed-components-with-tests: ${{ steps.determine.outputs.changed-components-with-tests }}
component-test-count: ${{ steps.determine.outputs.component-test-count }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
with:
# Fetch enough history to find the merge base
fetch-depth: 2
@@ -205,7 +203,6 @@ jobs:
echo "clang-tidy=$(echo "$output" | jq -r '.clang_tidy')" >> $GITHUB_OUTPUT
echo "python-linters=$(echo "$output" | jq -r '.python_linters')" >> $GITHUB_OUTPUT
echo "changed-components=$(echo "$output" | jq -c '.changed_components')" >> $GITHUB_OUTPUT
echo "changed-components-with-tests=$(echo "$output" | jq -c '.changed_components_with_tests')" >> $GITHUB_OUTPUT
echo "component-test-count=$(echo "$output" | jq -r '.component_test_count')" >> $GITHUB_OUTPUT
integration-tests:
@@ -217,15 +214,15 @@ jobs:
if: needs.determine-jobs.outputs.integration-tests == 'true'
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Set up Python 3.13
id: python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: "3.13"
- name: Restore Python virtual environment
id: cache-venv
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache@v4.2.3
with:
path: venv
key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
@@ -284,13 +281,13 @@ jobs:
pio_cache_key: tidyesp32-idf
- id: clang-tidy
name: Run script/clang-tidy for ZEPHYR
options: --environment nrf52-tidy --grep USE_ZEPHYR --grep USE_NRF52
options: --environment nrf52-tidy --grep USE_ZEPHYR
pio_cache_key: tidy-zephyr
ignore_errors: false
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
with:
# Need history for HEAD~1 to work for checking changed files
fetch-depth: 2
@@ -303,14 +300,14 @@ jobs:
- name: Cache platformio
if: github.ref == 'refs/heads/dev'
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache@v4.2.3
with:
path: ~/.platformio
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
- name: Cache platformio
if: github.ref != 'refs/heads/dev'
uses: actions/cache/restore@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache/restore@v4.2.3
with:
path: ~/.platformio
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
@@ -369,32 +366,31 @@ jobs:
fail-fast: false
max-parallel: 2
matrix:
file: ${{ fromJson(needs.determine-jobs.outputs.changed-components-with-tests) }}
file: ${{ fromJson(needs.determine-jobs.outputs.changed-components) }}
steps:
- name: Cache apt packages
uses: awalsh128/cache-apt-pkgs-action@acb598e5ddbc6f68a970c5da0688d2f3a9f04d05 # v1.5.3
with:
packages: libsdl2-dev
version: 1.0
- name: Install dependencies
run: |
sudo apt-get update
sudo apt-get install libsdl2-dev
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Validate config for ${{ matrix.file }}
- name: test_build_components -e config -c ${{ matrix.file }}
run: |
. venv/bin/activate
python3 script/test_build_components.py -e config -c ${{ matrix.file }}
- name: Compile config for ${{ matrix.file }}
./script/test_build_components -e config -c ${{ matrix.file }}
- name: test_build_components -e compile -c ${{ matrix.file }}
run: |
. venv/bin/activate
python3 script/test_build_components.py -e compile -c ${{ matrix.file }}
./script/test_build_components -e compile -c ${{ matrix.file }}
test-build-components-splitter:
name: Split components for intelligent grouping (40 weighted per batch)
name: Split components for testing into 20 groups maximum
runs-on: ubuntu-24.04
needs:
- common
@@ -404,27 +400,15 @@ jobs:
matrix: ${{ steps.split.outputs.components }}
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Split components intelligently based on bus configurations
uses: actions/checkout@v4.2.2
- name: Split components into 20 groups
id: split
run: |
. venv/bin/activate
# Use intelligent splitter that groups components with same bus configs
components='${{ needs.determine-jobs.outputs.changed-components-with-tests }}'
echo "Splitting components intelligently..."
output=$(python3 script/split_components_for_ci.py --components "$components" --batch-size 40 --output github)
echo "$output" >> $GITHUB_OUTPUT
components=$(echo '${{ needs.determine-jobs.outputs.changed-components }}' | jq -c '.[]' | shuf | jq -s -c '[_nwise(20) | join(" ")]')
echo "components=$components" >> $GITHUB_OUTPUT
test-build-components-split:
name: Test components batch (${{ matrix.components }})
name: Test split components
runs-on: ubuntu-24.04
needs:
- common
@@ -433,62 +417,39 @@ jobs:
if: github.event_name == 'pull_request' && fromJSON(needs.determine-jobs.outputs.component-test-count) >= 100
strategy:
fail-fast: false
max-parallel: ${{ (github.base_ref == 'beta' || github.base_ref == 'release') && 8 || 4 }}
max-parallel: 4
matrix:
components: ${{ fromJson(needs.test-build-components-splitter.outputs.matrix) }}
steps:
- name: Show disk space
run: |
echo "Available disk space:"
df -h
- name: List components
run: echo ${{ matrix.components }}
- name: Cache apt packages
uses: awalsh128/cache-apt-pkgs-action@acb598e5ddbc6f68a970c5da0688d2f3a9f04d05 # v1.5.3
with:
packages: libsdl2-dev
version: 1.0
- name: Install dependencies
run: |
sudo apt-get update
sudo apt-get install libsdl2-dev
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- name: Validate and compile components with intelligent grouping
- name: Validate config
run: |
. venv/bin/activate
# Use /mnt for build files (70GB available vs ~29GB on /)
# Bind mount PlatformIO directory to /mnt (tools, packages, build cache all go there)
sudo mkdir -p /mnt/platformio
sudo chown $USER:$USER /mnt/platformio
mkdir -p ~/.platformio
sudo mount --bind /mnt/platformio ~/.platformio
# Bind mount test build directory to /mnt
sudo mkdir -p /mnt/test_build_components_build
sudo chown $USER:$USER /mnt/test_build_components_build
mkdir -p tests/test_build_components/build
sudo mount --bind /mnt/test_build_components_build tests/test_build_components/build
# Convert space-separated components to comma-separated for Python script
components_csv=$(echo "${{ matrix.components }}" | tr ' ' ',')
echo "Testing components: $components_csv"
echo ""
# Run config validation with grouping
python3 script/test_build_components.py -e config -c "$components_csv" -f
echo ""
echo "Config validation passed! Starting compilation..."
echo ""
# Run compilation with grouping
python3 script/test_build_components.py -e compile -c "$components_csv" -f
for component in ${{ matrix.components }}; do
./script/test_build_components -e config -c $component
done
- name: Compile config
run: |
. venv/bin/activate
mkdir build_cache
export PLATFORMIO_BUILD_CACHE_DIR=$PWD/build_cache
for component in ${{ matrix.components }}; do
./script/test_build_components -e compile -c $component
done
pre-commit-ci-lite:
name: pre-commit.ci lite
@@ -498,16 +459,16 @@ jobs:
if: github.event_name == 'pull_request' && github.base_ref != 'beta' && github.base_ref != 'release'
steps:
- name: Check out code from GitHub
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Restore Python
uses: ./.github/actions/restore-python
with:
python-version: ${{ env.DEFAULT_PYTHON }}
cache-key: ${{ needs.common.outputs.cache-key }}
- uses: esphome/action@43cd1109c09c544d97196f7730ee5b2e0cc6d81e # v3.0.1 fork with pinned actions/cache
- uses: pre-commit/action@v3.0.1
env:
SKIP: pylint,clang-tidy-hash
- uses: pre-commit-ci/lite-action@5d6cc0eb514c891a40562a58a8e71576c5c7fb43 # v1.1.0
- uses: pre-commit-ci/lite-action@v1.1.0
if: always()
ci-status:

View File

@@ -25,7 +25,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Request reviews from component codeowners
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
const owner = context.repo.owner;
@@ -34,9 +34,6 @@ jobs:
console.log(`Processing PR #${pr_number} for codeowner review requests`);
// Hidden marker to identify bot comments from this workflow
const BOT_COMMENT_MARKER = '<!-- codeowner-review-request-bot -->';
try {
// Get the list of changed files in this PR
const { data: files } = await github.rest.pulls.listFiles({
@@ -87,9 +84,9 @@ jobs:
const allMentions = [...reviewerMentions, ...teamMentions].join(', ');
if (isSuccessful) {
return `${BOT_COMMENT_MARKER}\n👋 Hi there! I've automatically requested reviews from codeowners based on the files changed in this PR.\n\n${allMentions} - You've been requested to review this PR as codeowner(s) of ${matchedFileCount} file(s) that were modified. Thanks for your time! 🙏`;
return `👋 Hi there! I've automatically requested reviews from codeowners based on the files changed in this PR.\n\n${allMentions} - You've been requested to review this PR as codeowner(s) of ${matchedFileCount} file(s) that were modified. Thanks for your time! 🙏`;
} else {
return `${BOT_COMMENT_MARKER}\n👋 Hi there! This PR modifies ${matchedFileCount} file(s) with codeowners.\n\n${allMentions} - As codeowner(s) of the affected files, your review would be appreciated! 🙏\n\n_Note: Automatic review request may have failed, but you're still welcome to review._`;
return `👋 Hi there! This PR modifies ${matchedFileCount} file(s) with codeowners.\n\n${allMentions} - As codeowner(s) of the affected files, your review would be appreciated! 🙏\n\n_Note: Automatic review request may have failed, but you're still welcome to review._`;
}
}
@@ -181,53 +178,6 @@ jobs:
reviewedUsers.add(review.user.login);
});
// Check for previous comments from this workflow to avoid duplicate pings
const comments = await github.paginate(
github.rest.issues.listComments,
{
owner,
repo,
issue_number: pr_number
}
);
const previouslyPingedUsers = new Set();
const previouslyPingedTeams = new Set();
// Look for comments from github-actions bot that contain our bot marker
const workflowComments = comments.filter(comment =>
comment.user.type === 'Bot' &&
comment.body.includes(BOT_COMMENT_MARKER)
);
// Extract previously mentioned users and teams from workflow comments
for (const comment of workflowComments) {
// Match @username patterns (not team mentions)
const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
userMentions.forEach(mention => {
const username = mention.slice(1); // remove @
previouslyPingedUsers.add(username);
});
// Match @org/team patterns
const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/([a-zA-Z0-9_.-]+)/g) || [];
teamMentions.forEach(mention => {
const teamName = mention.split('/')[1];
previouslyPingedTeams.add(teamName);
});
}
console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams`);
// Remove users who have already been pinged in previous workflow comments
previouslyPingedUsers.forEach(user => {
matchedOwners.delete(user);
});
previouslyPingedTeams.forEach(team => {
matchedTeams.delete(team);
});
// Remove only users who have already submitted reviews (not just requested reviewers)
reviewedUsers.forEach(reviewer => {
matchedOwners.delete(reviewer);
@@ -242,7 +192,7 @@ jobs:
const teamsList = Array.from(matchedTeams);
if (reviewersList.length === 0 && teamsList.length === 0) {
console.log('No eligible reviewers found (all may already be requested, reviewed, or previously pinged)');
console.log('No eligible reviewers found (all may already be requested or reviewed)');
return;
}
@@ -277,41 +227,31 @@ jobs:
console.log('All codeowners are already requested reviewers or have reviewed');
}
// Only add a comment if there are new codeowners to mention (not previously pinged)
if (reviewersList.length > 0 || teamsList.length > 0) {
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, true);
// Add a comment to the PR mentioning what happened (include all matched codeowners)
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, true);
await github.rest.issues.createComment({
owner,
repo,
issue_number: pr_number,
body: commentBody
});
console.log(`Added comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
} else {
console.log('No new codeowners to mention in comment (all previously pinged)');
}
await github.rest.issues.createComment({
owner,
repo,
issue_number: pr_number,
body: commentBody
});
} catch (error) {
if (error.status === 422) {
console.log('Some reviewers may already be requested or unavailable:', error.message);
// Only try to add a comment if there are new codeowners to mention
if (reviewersList.length > 0 || teamsList.length > 0) {
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, false);
// Try to add a comment even if review request failed
const commentBody = createCommentBody(reviewersList, teamsList, fileMatches.size, false);
try {
await github.rest.issues.createComment({
owner,
repo,
issue_number: pr_number,
body: commentBody
});
console.log(`Added fallback comment mentioning ${reviewersList.length} users and ${teamsList.length} teams`);
} catch (commentError) {
console.log('Failed to add comment:', commentError.message);
}
} else {
console.log('No new codeowners to mention in fallback comment');
try {
await github.rest.issues.createComment({
owner,
repo,
issue_number: pr_number,
body: commentBody
});
} catch (commentError) {
console.log('Failed to add comment:', commentError.message);
}
} else {
throw error;

View File

@@ -54,11 +54,11 @@ jobs:
# your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
steps:
- name: Checkout repository
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
build-mode: ${{ matrix.build-mode }}
@@ -86,6 +86,6 @@ jobs:
exit 1
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"

View File

@@ -15,7 +15,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Add external component comment
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -61,8 +61,7 @@ jobs:
}
async function createComment(octokit, owner, repo, prNumber, esphomeChanges, componentChanges) {
const commentMarker = "<!-- This comment was generated automatically by the external-component-bot workflow. -->";
const legacyCommentMarker = "<!-- This comment was generated automatically by a GitHub workflow. -->";
const commentMarker = "<!-- This comment was generated automatically by a GitHub workflow. -->";
let commentBody;
if (esphomeChanges.length === 1) {
commentBody = generateExternalComponentInstructions(prNumber, componentChanges, owner, repo);
@@ -72,23 +71,14 @@ jobs:
commentBody += `\n\n---\n(Added by the PR bot)\n\n${commentMarker}`;
// Check for existing bot comment
const comments = await github.paginate(
github.rest.issues.listComments,
{
owner: owner,
repo: repo,
issue_number: prNumber,
per_page: 100,
}
);
const comments = await github.rest.issues.listComments({
owner: owner,
repo: repo,
issue_number: prNumber,
});
const sorted = comments.sort((a, b) => new Date(b.updated_at) - new Date(a.updated_at));
const botComment = sorted.find(comment =>
(
comment.body.includes(commentMarker) ||
comment.body.includes(legacyCommentMarker)
) && comment.user.type === "Bot"
const botComment = comments.data.find(comment =>
comment.body.includes(commentMarker)
);
if (botComment && botComment.body === commentBody) {

View File

@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Notify codeowners for component issues
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
script: |
const owner = context.repo.owner;
@@ -29,9 +29,6 @@ jobs:
console.log(`Processing issue #${issue_number} with label: ${labelName}`);
// Hidden marker to identify bot comments from this workflow
const BOT_COMMENT_MARKER = '<!-- issue-codeowner-notify-bot -->';
// Extract component name from label
const componentName = labelName.replace('component: ', '');
console.log(`Component: ${componentName}`);
@@ -95,57 +92,16 @@ jobs:
mention !== `@${issueAuthor}`
);
// Check for previous comments from this workflow to avoid duplicate pings
const comments = await github.paginate(
github.rest.issues.listComments,
{
owner,
repo,
issue_number: issue_number
}
);
const previouslyPingedUsers = new Set();
const previouslyPingedTeams = new Set();
// Look for comments from github-actions bot that contain codeowner pings for this component
const workflowComments = comments.filter(comment =>
comment.user.type === 'Bot' &&
comment.body.includes(BOT_COMMENT_MARKER) &&
comment.body.includes(`component: ${componentName}`)
);
// Extract previously mentioned users and teams from workflow comments
for (const comment of workflowComments) {
// Match @username patterns (not team mentions)
const userMentions = comment.body.match(/@([a-zA-Z0-9_.-]+)(?![/])/g) || [];
userMentions.forEach(mention => {
previouslyPingedUsers.add(mention); // Keep @ prefix for easy comparison
});
// Match @org/team patterns
const teamMentions = comment.body.match(/@[a-zA-Z0-9_.-]+\/[a-zA-Z0-9_.-]+/g) || [];
teamMentions.forEach(mention => {
previouslyPingedTeams.add(mention);
});
}
console.log(`Found ${previouslyPingedUsers.size} previously pinged users and ${previouslyPingedTeams.size} previously pinged teams for component ${componentName}`);
// Remove previously pinged users and teams
const newUserOwners = filteredUserOwners.filter(mention => !previouslyPingedUsers.has(mention));
const newTeamOwners = teamOwners.filter(mention => !previouslyPingedTeams.has(mention));
const allMentions = [...newUserOwners, ...newTeamOwners];
const allMentions = [...filteredUserOwners, ...teamOwners];
if (allMentions.length === 0) {
console.log('No new codeowners to notify (all previously pinged or issue author is the only codeowner)');
console.log('No codeowners to notify (issue author is the only codeowner)');
return;
}
// Create comment body
const mentionString = allMentions.join(', ');
const commentBody = `${BOT_COMMENT_MARKER}\n👋 Hey ${mentionString}!\n\nThis issue has been labeled with \`component: ${componentName}\` and you've been identified as a codeowner of this component. Please take a look when you have a chance!\n\nThanks for maintaining this component! 🙏`;
const commentBody = `👋 Hey ${mentionString}!\n\nThis issue has been labeled with \`component: ${componentName}\` and you've been identified as a codeowner of this component. Please take a look when you have a chance!\n\nThanks for maintaining this component! 🙏`;
// Post comment
await github.rest.issues.createComment({
@@ -155,7 +111,7 @@ jobs:
body: commentBody
});
console.log(`Successfully notified new codeowners: ${mentionString}`);
console.log(`Successfully notified codeowners: ${mentionString}`);
} catch (error) {
console.log('Failed to process codeowner notifications:', error.message);

.github/workflows/needs-docs.yml (new file, 24 lines)
View File

@@ -0,0 +1,24 @@
name: Needs Docs
on:
pull_request:
types: [labeled, unlabeled]
jobs:
check:
name: Check
runs-on: ubuntu-latest
steps:
- name: Check for needs-docs label
uses: actions/github-script@v7.0.1
with:
script: |
const { data: labels } = await github.rest.issues.listLabelsOnIssue({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number
});
const needsDocs = labels.find(label => label.name === 'needs-docs');
if (needsDocs) {
core.setFailed('Pull request needs docs');
}

View File

@@ -20,7 +20,7 @@ jobs:
branch_build: ${{ steps.tag.outputs.branch_build }}
deploy_env: ${{ steps.tag.outputs.deploy_env }}
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/checkout@v4.2.2
- name: Get tag
id: tag
# yamllint disable rule:line-length
@@ -60,9 +60,9 @@ jobs:
contents: read
id-token: write
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/checkout@v4.2.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: "3.x"
- name: Build
@@ -70,7 +70,7 @@ jobs:
pip3 install build
python3 -m build
- name: Publish
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
uses: pypa/gh-action-pypi-publish@v1.12.4
with:
skip-existing: true
@@ -92,22 +92,22 @@ jobs:
os: "ubuntu-24.04-arm"
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/checkout@v4.2.2
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: "3.11"
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@v3.11.1
- name: Log in to docker hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.4.0
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Log in to the GitHub container registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.4.0
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -138,7 +138,7 @@ jobs:
# version: ${{ needs.init.outputs.tag }}
- name: Upload digests
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4.6.2
with:
name: digests-${{ matrix.platform.arch }}
path: /tmp/digests
@@ -168,27 +168,27 @@ jobs:
- ghcr
- dockerhub
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: actions/checkout@v4.2.2
- name: Download digests
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
uses: actions/download-artifact@v4.3.0
with:
pattern: digests-*
path: /tmp/digests
merge-multiple: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@v3.11.1
- name: Log in to docker hub
if: matrix.registry == 'dockerhub'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.4.0
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Log in to the GitHub container registry
if: matrix.registry == 'ghcr'
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
uses: docker/login-action@v3.4.0
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -220,7 +220,7 @@ jobs:
- deploy-manifest
steps:
- name: Trigger Workflow
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
github-token: ${{ secrets.DEPLOY_HA_ADDON_REPO_TOKEN }}
script: |
@@ -246,7 +246,7 @@ jobs:
environment: ${{ needs.init.outputs.deploy_env }}
steps:
- name: Trigger Workflow
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
uses: actions/github-script@v7.0.1
with:
github-token: ${{ secrets.DEPLOY_ESPHOME_SCHEMA_REPO_TOKEN }}
script: |

View File

@@ -15,52 +15,36 @@ concurrency:
jobs:
stale:
if: github.repository_owner == 'esphome'
runs-on: ubuntu-latest
steps:
- name: Stale
uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
- uses: actions/stale@v9.1.0
with:
debug-only: ${{ github.ref != 'refs/heads/dev' }} # Dry-run when not run on dev branch
remove-stale-when-updated: true
operations-per-run: 150
# The 90 day stale policy for PRs
# - PRs
# - No PRs marked as "not-stale"
# - No Issues (see below)
days-before-pr-stale: 90
days-before-pr-close: 7
days-before-issue-stale: -1
days-before-issue-close: -1
remove-stale-when-updated: true
stale-pr-label: "stale"
exempt-pr-labels: "not-stale"
stale-pr-message: >
There hasn't been any activity on this pull request recently. This
pull request has been automatically marked as stale because of that
and will be closed if no further activity occurs within 7 days.
Thank you for your contributions.
If you are the author of this PR, please leave a comment if you want
to keep it open. Also, please rebase your PR onto the latest dev
branch to ensure that it's up to date with the latest changes.
Thank you for your contribution!
# The 90 day stale policy for Issues
# - Issues
# - No Issues marked as "not-stale"
# - No PRs (see above)
days-before-issue-stale: 90
days-before-issue-close: 7
# Use stale to automatically close issues with a
# reference to the issue tracker
close-issues:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v9.1.0
with:
days-before-pr-stale: -1
days-before-pr-close: -1
days-before-issue-stale: 1
days-before-issue-close: 1
remove-stale-when-updated: true
stale-issue-label: "stale"
exempt-issue-labels: "not-stale"
stale-issue-message: >
There hasn't been any activity on this issue recently. Due to the
high number of incoming GitHub notifications, we have to clean some
of the old issues, as many of them have already been resolved with
the latest updates.
Please make sure to update to the latest ESPHome version and
check if that solves the issue. Let us know if that works for you by
adding a comment 👍
This issue has now been marked as stale and will be closed if no
further activity occurs. Thank you for your contributions.
https://github.com/esphome/esphome/issues/430

View File

@@ -1,30 +0,0 @@
name: Status check labels
on:
pull_request:
types: [labeled, unlabeled]
jobs:
check:
name: Check ${{ matrix.label }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
label:
- needs-docs
- merge-after-release
steps:
- name: Check for ${{ matrix.label }} label
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
script: |
const { data: labels } = await github.rest.issues.listLabelsOnIssue({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number
});
const hasLabel = labels.find(label => label.name === '${{ matrix.label }}');
if (hasLabel) {
core.setFailed('Pull request cannot be merged, it is labeled as ${{ matrix.label }}');
}

View File

@@ -13,16 +13,16 @@ jobs:
if: github.repository == 'esphome/esphome'
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
- name: Checkout Home Assistant
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
uses: actions/checkout@v4.2.2
with:
repository: home-assistant/core
path: lib/home-assistant
- name: Setup Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
uses: actions/setup-python@v5.6.0
with:
python-version: 3.13
@@ -30,18 +30,13 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install -e lib/home-assistant
pip install -r requirements_test.txt pre-commit
- name: Sync
run: |
python ./script/sync-device_class.py
- name: Run pre-commit hooks
run: |
python script/run-in-env.py pre-commit run --all-files
- name: Commit changes
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@v7.0.8
with:
commit-message: "Synchronise Device Classes from Home Assistant"
committer: esphomebot <esphome@openhomefoundation.org>

View File

@@ -11,7 +11,7 @@ ci:
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.14.0
rev: v0.12.4
hooks:
# Run the linter.
- id: ruff

View File

@@ -40,11 +40,11 @@ esphome/components/analog_threshold/* @ianchi
esphome/components/animation/* @syndlex
esphome/components/anova/* @buxtronix
esphome/components/apds9306/* @aodrenah
esphome/components/api/* @esphome/core
esphome/components/api/* @OttoWinter
esphome/components/as5600/* @ammmze
esphome/components/as5600/sensor/* @ammmze
esphome/components/as7341/* @mrgnr
esphome/components/async_tcp/* @esphome/core
esphome/components/async_tcp/* @OttoWinter
esphome/components/at581x/* @X-Ryl669
esphome/components/atc_mithermometer/* @ahpohl
esphome/components/atm90e26/* @danieltwagner
@@ -66,10 +66,10 @@ esphome/components/binary_sensor/* @esphome/core
esphome/components/bk72xx/* @kuba2k2
esphome/components/bl0906/* @athom-tech @jesserockz @tarontop
esphome/components/bl0939/* @ziceva
esphome/components/bl0940/* @dan-s-github @tobias-
esphome/components/bl0940/* @tobias-
esphome/components/bl0942/* @dbuezas @dwmw2
esphome/components/ble_client/* @buxtronix @clydebarrow
esphome/components/bluetooth_proxy/* @bdraco @jesserockz
esphome/components/bluetooth_proxy/* @jesserockz
esphome/components/bme280_base/* @esphome/core
esphome/components/bme280_spi/* @apbodrov
esphome/components/bme680_bsec/* @trvrnrth
@@ -88,11 +88,10 @@ esphome/components/bp1658cj/* @Cossid
esphome/components/bp5758d/* @Cossid
esphome/components/button/* @esphome/core
esphome/components/bytebuffer/* @clydebarrow
esphome/components/camera/* @bdraco @DT-art1
esphome/components/camera_encoder/* @DT-art1
esphome/components/camera/* @DT-art1 @bdraco
esphome/components/canbus/* @danielschramm @mvturnho
esphome/components/cap1188/* @mreditor97
esphome/components/captive_portal/* @esphome/core
esphome/components/captive_portal/* @OttoWinter
esphome/components/ccs811/* @habbie
esphome/components/cd74hc4067/* @asoehlke
esphome/components/ch422g/* @clydebarrow @jesterret
@@ -119,7 +118,7 @@ esphome/components/dallas_temp/* @ssieb
esphome/components/daly_bms/* @s1lvi0
esphome/components/dashboard_import/* @esphome/core
esphome/components/datetime/* @jesserockz @rfdarter
esphome/components/debug/* @esphome/core
esphome/components/debug/* @OttoWinter
esphome/components/delonghi/* @grob6000
esphome/components/dfplayer/* @glmnet
esphome/components/dfrobot_sen0395/* @niklasweber
@@ -139,17 +138,15 @@ esphome/components/ens160_base/* @latonita @vincentscode
esphome/components/ens160_i2c/* @latonita
esphome/components/ens160_spi/* @latonita
esphome/components/ens210/* @itn3rd77
esphome/components/epaper_spi/* @esphome/core
esphome/components/es7210/* @kahrendt
esphome/components/es7243e/* @kbx81
esphome/components/es8156/* @kbx81
esphome/components/es8311/* @kahrendt @kroimon
esphome/components/es8388/* @P4uLT
esphome/components/esp32/* @esphome/core
esphome/components/esp32_ble/* @bdraco @jesserockz @Rapsssito
esphome/components/esp32_ble_client/* @bdraco @jesserockz
esphome/components/esp32_ble_server/* @clydebarrow @jesserockz @Rapsssito
esphome/components/esp32_ble_tracker/* @bdraco
esphome/components/esp32_ble/* @Rapsssito @jesserockz
esphome/components/esp32_ble_client/* @jesserockz
esphome/components/esp32_ble_server/* @Rapsssito @clydebarrow @jesserockz
esphome/components/esp32_camera_web_server/* @ayufan
esphome/components/esp32_can/* @Sympatron
esphome/components/esp32_hosted/* @swoboda1337
@@ -158,16 +155,16 @@ esphome/components/esp32_rmt/* @jesserockz
esphome/components/esp32_rmt_led_strip/* @jesserockz
esphome/components/esp8266/* @esphome/core
esphome/components/esp_ldo/* @clydebarrow
esphome/components/espnow/* @jesserockz
esphome/components/ethernet_info/* @gtjadsonsantos
esphome/components/event/* @nohat
esphome/components/event_emitter/* @Rapsssito
esphome/components/exposure_notifications/* @OttoWinter
esphome/components/ezo/* @ssieb
esphome/components/ezo_pmp/* @carlos-sarmiento
esphome/components/factory_reset/* @anatoly-savchenkov
esphome/components/fastled_base/* @OttoWinter
esphome/components/feedback/* @ianchi
esphome/components/fingerprint_grow/* @alexborro @loongyh @OnFreund
esphome/components/fingerprint_grow/* @OnFreund @alexborro @loongyh
esphome/components/font/* @clydebarrow @esphome/core
esphome/components/fs3000/* @kahrendt
esphome/components/ft5x06/* @clydebarrow
@@ -203,7 +200,7 @@ esphome/components/heatpumpir/* @rob-deutsch
esphome/components/hitachi_ac424/* @sourabhjaiswal
esphome/components/hm3301/* @freekode
esphome/components/hmac_md5/* @dwmw2
esphome/components/homeassistant/* @esphome/core @OttoWinter
esphome/components/homeassistant/* @OttoWinter @esphome/core
esphome/components/homeassistant/number/* @landonr
esphome/components/homeassistant/switch/* @Links2004
esphome/components/honeywell_hih_i2c/* @Benichou34
@@ -228,18 +225,18 @@ esphome/components/iaqcore/* @yozik04
esphome/components/ili9xxx/* @clydebarrow @nielsnl68
esphome/components/improv_base/* @esphome/core
esphome/components/improv_serial/* @esphome/core
esphome/components/ina226/* @latonita @Sergio303
esphome/components/ina226/* @Sergio303 @latonita
esphome/components/ina260/* @mreditor97
esphome/components/ina2xx_base/* @latonita
esphome/components/ina2xx_i2c/* @latonita
esphome/components/ina2xx_spi/* @latonita
esphome/components/inkbird_ibsth1_mini/* @fkirill
esphome/components/inkplate/* @jesserockz @JosipKuci
esphome/components/inkplate6/* @jesserockz
esphome/components/integration/* @OttoWinter
esphome/components/internal_temperature/* @Mat931
esphome/components/interval/* @esphome/core
esphome/components/jsn_sr04t/* @Mafus1
esphome/components/json/* @esphome/core
esphome/components/json/* @OttoWinter
esphome/components/kamstrup_kmp/* @cfeenstra1024
esphome/components/key_collector/* @ssieb
esphome/components/key_provider/* @ssieb
@@ -247,17 +244,14 @@ esphome/components/kuntze/* @ssieb
esphome/components/lc709203f/* @ilikecake
esphome/components/lcd_menu/* @numo68
esphome/components/ld2410/* @regevbr @sebcaps
esphome/components/ld2412/* @Rihan9
esphome/components/ld2420/* @descipher
esphome/components/ld2450/* @hareeshmu
esphome/components/ld24xx/* @kbx81
esphome/components/ledc/* @OttoWinter
esphome/components/libretiny/* @kuba2k2
esphome/components/libretiny_pwm/* @kuba2k2
esphome/components/light/* @esphome/core
esphome/components/lightwaverf/* @max246
esphome/components/lilygo_t5_47/touchscreen/* @jesserockz
esphome/components/lm75b/* @beormund
esphome/components/ln882x/* @lamauny
esphome/components/lock/* @esphome/core
esphome/components/logger/* @esphome/core
@@ -278,8 +272,8 @@ esphome/components/max7219digit/* @rspaargaren
esphome/components/max9611/* @mckaymatthew
esphome/components/mcp23008/* @jesserockz
esphome/components/mcp23017/* @jesserockz
esphome/components/mcp23s08/* @jesserockz @SenexCrenshaw
esphome/components/mcp23s17/* @jesserockz @SenexCrenshaw
esphome/components/mcp23s08/* @SenexCrenshaw @jesserockz
esphome/components/mcp23s17/* @SenexCrenshaw @jesserockz
esphome/components/mcp23x08_base/* @jesserockz
esphome/components/mcp23x17_base/* @jesserockz
esphome/components/mcp23xxx_base/* @jesserockz
@@ -299,8 +293,6 @@ esphome/components/microphone/* @jesserockz @kahrendt
esphome/components/mics_4514/* @jesserockz
esphome/components/midea/* @dudanov
esphome/components/midea_ir/* @dudanov
esphome/components/mipi_dsi/* @clydebarrow
esphome/components/mipi_rgb/* @clydebarrow
esphome/components/mipi_spi/* @clydebarrow
esphome/components/mitsubishi/* @RubyBailey
esphome/components/mixer/speaker/* @kahrendt
@@ -344,7 +336,7 @@ esphome/components/ota/* @esphome/core
esphome/components/output/* @esphome/core
esphome/components/packet_transport/* @clydebarrow
esphome/components/pca6416a/* @Mat931
esphome/components/pca9554/* @bdraco @clydebarrow @hwstar
esphome/components/pca9554/* @clydebarrow @hwstar
esphome/components/pcf85063/* @brogon
esphome/components/pcf8563/* @KoenBreeman
esphome/components/pi4ioe5v6408/* @jesserockz
@@ -355,9 +347,9 @@ esphome/components/pm2005/* @andrewjswan
esphome/components/pmsa003i/* @sjtrny
esphome/components/pmsx003/* @ximex
esphome/components/pmwcs3/* @SeByDocKy
esphome/components/pn532/* @jesserockz @OttoWinter
esphome/components/pn532_i2c/* @jesserockz @OttoWinter
esphome/components/pn532_spi/* @jesserockz @OttoWinter
esphome/components/pn532/* @OttoWinter @jesserockz
esphome/components/pn532_i2c/* @OttoWinter @jesserockz
esphome/components/pn532_spi/* @OttoWinter @jesserockz
esphome/components/pn7150/* @jesserockz @kbx81
esphome/components/pn7150_i2c/* @jesserockz @kbx81
esphome/components/pn7160/* @jesserockz @kbx81
@@ -366,7 +358,7 @@ esphome/components/pn7160_spi/* @jesserockz @kbx81
esphome/components/power_supply/* @esphome/core
esphome/components/preferences/* @esphome/core
esphome/components/psram/* @esphome/core
esphome/components/pulse_meter/* @cstaahl @stevebaxter @TrentHouliston
esphome/components/pulse_meter/* @TrentHouliston @cstaahl @stevebaxter
esphome/components/pvvx_mithermometer/* @pasiz
esphome/components/pylontech/* @functionpointer
esphome/components/qmp6988/* @andrewpc
@@ -407,8 +399,7 @@ esphome/components/sensirion_common/* @martgras
esphome/components/sensor/* @esphome/core
esphome/components/sfa30/* @ghsensdev
esphome/components/sgp40/* @SenexCrenshaw
esphome/components/sgp4x/* @martgras @SenexCrenshaw
esphome/components/sha256/* @esphome/core
esphome/components/sgp4x/* @SenexCrenshaw @martgras
esphome/components/shelly_dimmer/* @edge90 @rnauber
esphome/components/sht3xd/* @mrtoy-me
esphome/components/sht4x/* @sjtrny
@@ -430,7 +421,6 @@ esphome/components/speaker/media_player/* @kahrendt @synesthesiam
esphome/components/spi/* @clydebarrow @esphome/core
esphome/components/spi_device/* @clydebarrow
esphome/components/spi_led_strip/* @clydebarrow
esphome/components/split_buffer/* @jesserockz
esphome/components/sprinkler/* @kbx81
esphome/components/sps30/* @martgras
esphome/components/ssd1322_base/* @kbx81
@@ -474,13 +464,13 @@ esphome/components/template/event/* @nohat
esphome/components/template/fan/* @ssieb
esphome/components/text/* @mauritskorse
esphome/components/thermostat/* @kbx81
esphome/components/time/* @esphome/core
esphome/components/time/* @OttoWinter
esphome/components/tlc5947/* @rnauber
esphome/components/tlc5971/* @IJIJI
esphome/components/tm1621/* @Philippe12
esphome/components/tm1637/* @glmnet
esphome/components/tm1638/* @skykingjwc
esphome/components/tm1651/* @mrtoy-me
esphome/components/tm1651/* @freekode
esphome/components/tmp102/* @timsavage
esphome/components/tmp1075/* @sybrenstuvel
esphome/components/tmp117/* @Azimath
@@ -518,7 +508,7 @@ esphome/components/wake_on_lan/* @clydebarrow @willwill2will54
esphome/components/watchdog/* @oarcher
esphome/components/waveshare_epaper/* @clydebarrow
esphome/components/web_server/ota/* @esphome/core
esphome/components/web_server_base/* @esphome/core
esphome/components/web_server_base/* @OttoWinter
esphome/components/web_server_idf/* @dentra
esphome/components/weikai/* @DrCoolZic
esphome/components/weikai_i2c/* @DrCoolZic
@@ -536,7 +526,6 @@ esphome/components/wk2204_spi/* @DrCoolZic
esphome/components/wk2212_i2c/* @DrCoolZic
esphome/components/wk2212_spi/* @DrCoolZic
esphome/components/wl_134/* @hobbypunk90
esphome/components/wts01/* @alepee
esphome/components/x9c/* @EtienneMD
esphome/components/xgzp68xx/* @gcormier
esphome/components/xiaomi_hhccjcy10/* @fariouche
@@ -552,4 +541,3 @@ esphome/components/xxtea/* @clydebarrow
esphome/components/zephyr/* @tomaszduda23
esphome/components/zhlt01/* @cfeenstra1024
esphome/components/zio_ultrasonic/* @kahrendt
esphome/components/zwave_proxy/* @kbx81

View File

@@ -48,7 +48,7 @@ PROJECT_NAME = ESPHome
# could be handy for archiving the generated documentation or if some version
# control system is used.
PROJECT_NUMBER = 2025.10.1
PROJECT_NUMBER = 2025.8.0-dev
# Using the PROJECT_BRIEF tag one can provide an optional one line description
# for a project that appears at the top of each page and should give viewer a

View File

@@ -90,7 +90,7 @@ def main():
def run_command(*cmd, ignore_error: bool = False):
print(f"$ {shlex.join(list(cmd))}")
if not args.dry_run:
rc = subprocess.call(list(cmd), close_fds=False)
rc = subprocess.call(list(cmd))
if rc != 0 and not ignore_error:
print("Command failed")
sys.exit(1)

File diff suppressed because it is too large

View File

@@ -1,142 +0,0 @@
"""Address cache for DNS and mDNS lookups."""
from __future__ import annotations
import logging
from typing import TYPE_CHECKING
if TYPE_CHECKING:
from collections.abc import Iterable
_LOGGER = logging.getLogger(__name__)
def normalize_hostname(hostname: str) -> str:
"""Normalize hostname for cache lookups.
Removes trailing dots and converts to lowercase.
"""
return hostname.rstrip(".").lower()
class AddressCache:
"""Cache for DNS and mDNS address lookups.
This cache stores pre-resolved addresses from command-line arguments
to avoid slow DNS/mDNS lookups during builds.
"""
def __init__(
self,
mdns_cache: dict[str, list[str]] | None = None,
dns_cache: dict[str, list[str]] | None = None,
) -> None:
"""Initialize the address cache.
Args:
mdns_cache: Pre-populated mDNS addresses (hostname -> IPs)
dns_cache: Pre-populated DNS addresses (hostname -> IPs)
"""
self.mdns_cache = mdns_cache or {}
self.dns_cache = dns_cache or {}
def _get_cached_addresses(
self, hostname: str, cache: dict[str, list[str]], cache_type: str
) -> list[str] | None:
"""Get cached addresses from a specific cache.
Args:
hostname: The hostname to look up
cache: The cache dictionary to check
cache_type: Type of cache for logging ("mDNS" or "DNS")
Returns:
List of IP addresses if found in cache, None otherwise
"""
normalized = normalize_hostname(hostname)
if addresses := cache.get(normalized):
_LOGGER.debug("Using %s cache for %s: %s", cache_type, hostname, addresses)
return addresses
return None
def get_mdns_addresses(self, hostname: str) -> list[str] | None:
"""Get cached mDNS addresses for a hostname.
Args:
hostname: The hostname to look up (should end with .local)
Returns:
List of IP addresses if found in cache, None otherwise
"""
return self._get_cached_addresses(hostname, self.mdns_cache, "mDNS")
def get_dns_addresses(self, hostname: str) -> list[str] | None:
"""Get cached DNS addresses for a hostname.
Args:
hostname: The hostname to look up
Returns:
List of IP addresses if found in cache, None otherwise
"""
return self._get_cached_addresses(hostname, self.dns_cache, "DNS")
def get_addresses(self, hostname: str) -> list[str] | None:
"""Get cached addresses for a hostname.
Checks mDNS cache for .local domains, DNS cache otherwise.
Args:
hostname: The hostname to look up
Returns:
List of IP addresses if found in cache, None otherwise
"""
normalized = normalize_hostname(hostname)
if normalized.endswith(".local"):
return self.get_mdns_addresses(hostname)
return self.get_dns_addresses(hostname)
def has_cache(self) -> bool:
"""Check if any cache entries exist."""
return bool(self.mdns_cache or self.dns_cache)
@classmethod
def from_cli_args(
cls, mdns_args: Iterable[str], dns_args: Iterable[str]
) -> AddressCache:
"""Create cache from command-line arguments.
Args:
mdns_args: List of mDNS cache entries like ['host=ip1,ip2']
dns_args: List of DNS cache entries like ['host=ip1,ip2']
Returns:
Configured AddressCache instance
"""
mdns_cache = cls._parse_cache_args(mdns_args)
dns_cache = cls._parse_cache_args(dns_args)
return cls(mdns_cache=mdns_cache, dns_cache=dns_cache)
@staticmethod
def _parse_cache_args(cache_args: Iterable[str]) -> dict[str, list[str]]:
"""Parse cache arguments into a dictionary.
Args:
cache_args: List of cache mappings like ['host1=ip1,ip2', 'host2=ip3']
Returns:
Dictionary mapping normalized hostnames to list of IP addresses
"""
cache: dict[str, list[str]] = {}
for arg in cache_args:
if "=" not in arg:
_LOGGER.warning(
"Invalid cache format: %s (expected 'hostname=ip1,ip2')", arg
)
continue
hostname, ips = arg.split("=", 1)
# Normalize hostname for consistent lookups
normalized = normalize_hostname(hostname)
cache[normalized] = [ip.strip() for ip in ips.split(",")]
return cache
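
For orientation, a minimal usage sketch of the AddressCache shown above (not part of the diff; the hostnames and IP addresses are invented, and it assumes the class is in scope):

# Sketch only: exercise the CLI-argument parsing and cache lookups defined above.
cache = AddressCache.from_cli_args(
    mdns_args=["mydevice.local=192.168.1.50"],
    dns_args=["ota.example.com=203.0.113.7,203.0.113.8"],
)
assert cache.get_addresses("MyDevice.local.") == ["192.168.1.50"]  # normalized, served from the mDNS cache
assert cache.get_addresses("ota.example.com") == ["203.0.113.7", "203.0.113.8"]  # served from the DNS cache
assert cache.get_addresses("unknown.local") is None  # cache miss
assert cache.has_cache()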

View File

@@ -15,10 +15,7 @@ from esphome.const import (
CONF_TYPE_ID,
CONF_UPDATE_INTERVAL,
)
from esphome.core import ID
from esphome.cpp_generator import MockObj, MockObjClass, TemplateArgsType
from esphome.schema_extractors import SCHEMA_EXTRACT, schema_extractor
from esphome.types import ConfigType
from esphome.util import Registry
@@ -52,11 +49,11 @@ def maybe_conf(conf, *validators):
return validate
def register_action(name: str, action_type: MockObjClass, schema: cv.Schema):
def register_action(name, action_type, schema):
return ACTION_REGISTRY.register(name, action_type, schema)
def register_condition(name: str, condition_type: MockObjClass, schema: cv.Schema):
def register_condition(name, condition_type, schema):
return CONDITION_REGISTRY.register(name, condition_type, schema)
@@ -167,78 +164,43 @@ XorCondition = cg.esphome_ns.class_("XorCondition", Condition)
@register_condition("and", AndCondition, validate_condition_list)
async def and_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def and_condition_to_code(config, condition_id, template_arg, args):
conditions = await build_condition_list(config, template_arg, args)
return cg.new_Pvariable(condition_id, template_arg, conditions)
@register_condition("or", OrCondition, validate_condition_list)
async def or_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def or_condition_to_code(config, condition_id, template_arg, args):
conditions = await build_condition_list(config, template_arg, args)
return cg.new_Pvariable(condition_id, template_arg, conditions)
@register_condition("all", AndCondition, validate_condition_list)
async def all_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def all_condition_to_code(config, condition_id, template_arg, args):
conditions = await build_condition_list(config, template_arg, args)
return cg.new_Pvariable(condition_id, template_arg, conditions)
@register_condition("any", OrCondition, validate_condition_list)
async def any_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def any_condition_to_code(config, condition_id, template_arg, args):
conditions = await build_condition_list(config, template_arg, args)
return cg.new_Pvariable(condition_id, template_arg, conditions)
@register_condition("not", NotCondition, validate_potentially_and_condition)
async def not_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def not_condition_to_code(config, condition_id, template_arg, args):
condition = await build_condition(config, template_arg, args)
return cg.new_Pvariable(condition_id, template_arg, condition)
@register_condition("xor", XorCondition, validate_condition_list)
async def xor_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def xor_condition_to_code(config, condition_id, template_arg, args):
conditions = await build_condition_list(config, template_arg, args)
return cg.new_Pvariable(condition_id, template_arg, conditions)
@register_condition("lambda", LambdaCondition, cv.returning_lambda)
async def lambda_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def lambda_condition_to_code(config, condition_id, template_arg, args):
lambda_ = await cg.process_lambda(config, args, return_type=bool)
return cg.new_Pvariable(condition_id, template_arg, lambda_)
@@ -255,12 +217,7 @@ async def lambda_condition_to_code(
}
).extend(cv.COMPONENT_SCHEMA),
)
async def for_condition_to_code(
config: ConfigType,
condition_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def for_condition_to_code(config, condition_id, template_arg, args):
condition = await build_condition(
config[CONF_CONDITION], cg.TemplateArguments(), []
)
@@ -274,12 +231,7 @@ async def for_condition_to_code(
@register_action(
"delay", DelayAction, cv.templatable(cv.positive_time_period_milliseconds)
)
async def delay_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def delay_action_to_code(config, action_id, template_arg, args):
var = cg.new_Pvariable(action_id, template_arg)
await cg.register_component(var, {})
template_ = await cg.templatable(config, args, cg.uint32)
@@ -304,15 +256,10 @@ async def delay_action_to_code(
cv.has_at_least_one_key(CONF_CONDITION, CONF_ANY, CONF_ALL),
),
)
async def if_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def if_action_to_code(config, action_id, template_arg, args):
cond_conf = next(el for el in config if el in (CONF_ANY, CONF_ALL, CONF_CONDITION))
condition = await build_condition(config[cond_conf], template_arg, args)
var = cg.new_Pvariable(action_id, template_arg, condition)
conditions = await build_condition(config[cond_conf], template_arg, args)
var = cg.new_Pvariable(action_id, template_arg, conditions)
if CONF_THEN in config:
actions = await build_action_list(config[CONF_THEN], template_arg, args)
cg.add(var.add_then(actions))
@@ -332,14 +279,9 @@ async def if_action_to_code(
}
),
)
async def while_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
condition = await build_condition(config[CONF_CONDITION], template_arg, args)
var = cg.new_Pvariable(action_id, template_arg, condition)
async def while_action_to_code(config, action_id, template_arg, args):
conditions = await build_condition(config[CONF_CONDITION], template_arg, args)
var = cg.new_Pvariable(action_id, template_arg, conditions)
actions = await build_action_list(config[CONF_THEN], template_arg, args)
cg.add(var.add_then(actions))
return var
@@ -355,12 +297,7 @@ async def while_action_to_code(
}
),
)
async def repeat_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def repeat_action_to_code(config, action_id, template_arg, args):
var = cg.new_Pvariable(action_id, template_arg)
count_template = await cg.templatable(config[CONF_COUNT], args, cg.uint32)
cg.add(var.set_count(count_template))
@@ -383,14 +320,9 @@ _validate_wait_until = cv.maybe_simple_value(
@register_action("wait_until", WaitUntilAction, _validate_wait_until)
async def wait_until_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
condition = await build_condition(config[CONF_CONDITION], template_arg, args)
var = cg.new_Pvariable(action_id, template_arg, condition)
async def wait_until_action_to_code(config, action_id, template_arg, args):
conditions = await build_condition(config[CONF_CONDITION], template_arg, args)
var = cg.new_Pvariable(action_id, template_arg, conditions)
if CONF_TIMEOUT in config:
template_ = await cg.templatable(config[CONF_TIMEOUT], args, cg.uint32)
cg.add(var.set_timeout_value(template_))
@@ -399,12 +331,7 @@ async def wait_until_action_to_code(
@register_action("lambda", LambdaAction, cv.lambda_)
async def lambda_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def lambda_action_to_code(config, action_id, template_arg, args):
lambda_ = await cg.process_lambda(config, args, return_type=cg.void)
return cg.new_Pvariable(action_id, template_arg, lambda_)
@@ -418,12 +345,7 @@ async def lambda_action_to_code(
}
),
)
async def component_update_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def component_update_action_to_code(config, action_id, template_arg, args):
comp = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, comp)
@@ -437,12 +359,7 @@ async def component_update_action_to_code(
}
),
)
async def component_suspend_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def component_suspend_action_to_code(config, action_id, template_arg, args):
comp = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, comp)
@@ -459,12 +376,7 @@ async def component_suspend_action_to_code(
}
),
)
async def component_resume_action_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
) -> MockObj:
async def component_resume_action_to_code(config, action_id, template_arg, args):
comp = await cg.get_variable(config[CONF_ID])
var = cg.new_Pvariable(action_id, template_arg, comp)
if CONF_UPDATE_INTERVAL in config:
@@ -473,51 +385,43 @@ async def component_resume_action_to_code(
return var
async def build_action(
full_config: ConfigType, template_arg: cg.TemplateArguments, args: TemplateArgsType
) -> MockObj:
async def build_action(full_config, template_arg, args):
registry_entry, config = cg.extract_registry_entry_config(
ACTION_REGISTRY, full_config
)
action_id = full_config[CONF_TYPE_ID]
builder = registry_entry.coroutine_fun
return await builder(config, action_id, template_arg, args)
ret = await builder(config, action_id, template_arg, args)
return ret
async def build_action_list(
config: list[ConfigType], templ: cg.TemplateArguments, arg_type: TemplateArgsType
) -> list[MockObj]:
actions: list[MockObj] = []
async def build_action_list(config, templ, arg_type):
actions = []
for conf in config:
action = await build_action(conf, templ, arg_type)
actions.append(action)
return actions
async def build_condition(
full_config: ConfigType, template_arg: cg.TemplateArguments, args: TemplateArgsType
) -> MockObj:
async def build_condition(full_config, template_arg, args):
registry_entry, config = cg.extract_registry_entry_config(
CONDITION_REGISTRY, full_config
)
action_id = full_config[CONF_TYPE_ID]
builder = registry_entry.coroutine_fun
return await builder(config, action_id, template_arg, args)
ret = await builder(config, action_id, template_arg, args)
return ret
async def build_condition_list(
config: ConfigType, templ: cg.TemplateArguments, args: TemplateArgsType
) -> list[MockObj]:
conditions: list[MockObj] = []
async def build_condition_list(config, templ, args):
conditions = []
for conf in config:
condition = await build_condition(conf, templ, args)
conditions.append(condition)
return conditions
async def build_automation(
trigger: MockObj, args: TemplateArgsType, config: ConfigType
) -> MockObj:
async def build_automation(trigger, args, config):
arg_types = [arg[0] for arg in args]
templ = cg.TemplateArguments(*arg_types)
obj = cg.new_Pvariable(config[CONF_AUTOMATION_ID], templ, trigger)

View File

@@ -1,100 +0,0 @@
from esphome.const import __version__
from esphome.core import CORE
from esphome.helpers import mkdir_p, read_file, write_file_if_changed
from esphome.writer import find_begin_end, update_storage_json
INI_AUTO_GENERATE_BEGIN = "; ========== AUTO GENERATED CODE BEGIN ==========="
INI_AUTO_GENERATE_END = "; =========== AUTO GENERATED CODE END ============"
INI_BASE_FORMAT = (
"""; Auto generated code by esphome
[common]
lib_deps =
build_flags =
upload_flags =
""",
"""
""",
)
def format_ini(data: dict[str, str | list[str]]) -> str:
content = ""
for key, value in sorted(data.items()):
if isinstance(value, list):
content += f"{key} =\n"
for x in value:
content += f" {x}\n"
else:
content += f"{key} = {value}\n"
return content
def get_ini_content():
CORE.add_platformio_option(
"lib_deps",
[x.as_lib_dep for x in CORE.platformio_libraries.values()]
+ ["${common.lib_deps}"],
)
# Sort to avoid changing build flags order
CORE.add_platformio_option("build_flags", sorted(CORE.build_flags))
# Sort to avoid changing build unflags order
CORE.add_platformio_option("build_unflags", sorted(CORE.build_unflags))
# Add extra script for C++ flags
CORE.add_platformio_option("extra_scripts", [f"pre:{CXX_FLAGS_FILE_NAME}"])
content = "[platformio]\n"
content += f"description = ESPHome {__version__}\n"
content += f"[env:{CORE.name}]\n"
content += format_ini(CORE.platformio_options)
return content
def write_ini(content):
update_storage_json()
path = CORE.relative_build_path("platformio.ini")
if path.is_file():
text = read_file(path)
content_format = find_begin_end(
text, INI_AUTO_GENERATE_BEGIN, INI_AUTO_GENERATE_END
)
else:
content_format = INI_BASE_FORMAT
full_file = f"{content_format[0] + INI_AUTO_GENERATE_BEGIN}\n{content}"
full_file += INI_AUTO_GENERATE_END + content_format[1]
write_file_if_changed(path, full_file)
def write_project():
mkdir_p(CORE.build_path)
content = get_ini_content()
write_ini(content)
# Write extra script for C++ specific flags
write_cxx_flags_script()
CXX_FLAGS_FILE_NAME = "cxx_flags.py"
CXX_FLAGS_FILE_CONTENTS = """# Auto-generated ESPHome script for C++ specific compiler flags
Import("env")
# Add C++ specific flags
"""
def write_cxx_flags_script() -> None:
path = CORE.relative_build_path(CXX_FLAGS_FILE_NAME)
contents = CXX_FLAGS_FILE_CONTENTS
if not CORE.is_host:
contents += 'env.Append(CXXFLAGS=["-Wno-volatile"])'
contents += "\n"
write_file_if_changed(path, contents)
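
For orientation, a rough sketch (not part of the diff) of the serialization rule used by format_ini above; the function is re-implemented locally so the snippet runs standalone, and the option names/values are invented:

def format_ini_sketch(data: dict[str, str | list[str]]) -> str:
    # Same rule as format_ini above: keys sorted alphabetically,
    # "key = value" for scalars, an indented block for list values.
    content = ""
    for key, value in sorted(data.items()):
        if isinstance(value, list):
            content += f"{key} =\n"
            for x in value:
                content += f"    {x}\n"
        else:
            content += f"{key} = {value}\n"
    return content

print(format_ini_sketch({"board": "esp32dev", "build_flags": ["-DUSE_FOO", "-DUSE_BAR"]}))
# board = esp32dev
# build_flags =
#     -DUSE_FOO
#     -DUSE_BAR

Per the code above, get_ini_content wraps output like this in a "[platformio]" header plus an "[env:<node name>]" section, and write_ini splices it between the INI_AUTO_GENERATE_BEGIN/END markers in platformio.ini.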

View File

@@ -12,7 +12,6 @@ from esphome.cpp_generator import ( # noqa: F401
ArrayInitializer,
Expression,
LineComment,
LogStringLiteral,
MockObj,
MockObjClass,
Pvariable,

View File

@@ -7,6 +7,7 @@ namespace a4988 {
static const char *const TAG = "a4988.stepper";
void A4988::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
if (this->sleep_pin_ != nullptr) {
this->sleep_pin_->setup();
this->sleep_pin_->digital_write(false);

View File

@@ -7,6 +7,8 @@ namespace absolute_humidity {
static const char *const TAG = "absolute_humidity.sensor";
void AbsoluteHumidityComponent::setup() {
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
ESP_LOGD(TAG, " Added callback for temperature '%s'", this->temperature_sensor_->get_name().c_str());
this->temperature_sensor_->add_on_state_callback([this](float state) { this->temperature_callback_(state); });
if (this->temperature_sensor_->has_state()) {
@@ -61,10 +63,11 @@ void AbsoluteHumidityComponent::loop() {
ESP_LOGW(TAG, "No valid state from temperature sensor!");
}
if (no_humidity) {
ESP_LOGW(TAG, "No valid state from humidity sensor!");
ESP_LOGW(TAG, "No valid state from temperature sensor!");
}
ESP_LOGW(TAG, "Unable to calculate absolute humidity.");
this->publish_state(NAN);
this->status_set_warning(LOG_STR("Unable to calculate absolute humidity."));
this->status_set_warning();
return;
}
@@ -86,8 +89,9 @@ void AbsoluteHumidityComponent::loop() {
es = es_wobus(temperature_c);
break;
default:
ESP_LOGE(TAG, "Invalid saturation vapor pressure equation selection!");
this->publish_state(NAN);
this->status_set_error("Invalid saturation vapor pressure equation selection!");
this->status_set_error();
return;
}
ESP_LOGD(TAG, "Saturation vapor pressure %f kPa", es);

View File

@@ -5,7 +5,7 @@ from esphome.const import (
CONF_EQUATION,
CONF_HUMIDITY,
CONF_TEMPERATURE,
DEVICE_CLASS_ABSOLUTE_HUMIDITY,
ICON_WATER,
STATE_CLASS_MEASUREMENT,
UNIT_GRAMS_PER_CUBIC_METER,
)
@@ -27,8 +27,8 @@ EQUATION = {
CONFIG_SCHEMA = (
sensor.sensor_schema(
unit_of_measurement=UNIT_GRAMS_PER_CUBIC_METER,
icon=ICON_WATER,
accuracy_decimals=2,
device_class=DEVICE_CLASS_ABSOLUTE_HUMIDITY,
state_class=STATE_CLASS_MEASUREMENT,
)
.extend(

View File

@@ -1,6 +1,6 @@
from esphome import pins
import esphome.codegen as cg
from esphome.components.esp32 import VARIANT_ESP32P4, get_esp32_variant
from esphome.components.esp32 import get_esp32_variant
from esphome.components.esp32.const import (
VARIANT_ESP32,
VARIANT_ESP32C2,
@@ -11,8 +11,15 @@ from esphome.components.esp32.const import (
VARIANT_ESP32S2,
VARIANT_ESP32S3,
)
from esphome.config_helpers import filter_source_files_from_platform
import esphome.config_validation as cv
from esphome.const import CONF_ANALOG, CONF_INPUT, CONF_NUMBER, PLATFORM_ESP8266
from esphome.const import (
CONF_ANALOG,
CONF_INPUT,
CONF_NUMBER,
PLATFORM_ESP8266,
PlatformFramework,
)
from esphome.core import CORE
CODEOWNERS = ["@esphome/core"]
@@ -133,16 +140,6 @@ ESP32_VARIANT_ADC1_PIN_TO_CHANNEL = {
9: adc_channel_t.ADC_CHANNEL_8,
10: adc_channel_t.ADC_CHANNEL_9,
},
VARIANT_ESP32P4: {
16: adc_channel_t.ADC_CHANNEL_0,
17: adc_channel_t.ADC_CHANNEL_1,
18: adc_channel_t.ADC_CHANNEL_2,
19: adc_channel_t.ADC_CHANNEL_3,
20: adc_channel_t.ADC_CHANNEL_4,
21: adc_channel_t.ADC_CHANNEL_5,
22: adc_channel_t.ADC_CHANNEL_6,
23: adc_channel_t.ADC_CHANNEL_7,
},
}
# pin to adc2 channel mapping
@@ -201,14 +198,6 @@ ESP32_VARIANT_ADC2_PIN_TO_CHANNEL = {
19: adc_channel_t.ADC_CHANNEL_8,
20: adc_channel_t.ADC_CHANNEL_9,
},
VARIANT_ESP32P4: {
49: adc_channel_t.ADC_CHANNEL_0,
50: adc_channel_t.ADC_CHANNEL_1,
51: adc_channel_t.ADC_CHANNEL_2,
52: adc_channel_t.ADC_CHANNEL_3,
53: adc_channel_t.ADC_CHANNEL_4,
54: adc_channel_t.ADC_CHANNEL_5,
},
}
@@ -260,9 +249,21 @@ def validate_adc_pin(value):
{CONF_ANALOG: True, CONF_INPUT: True}, internal=True
)(value)
if CORE.is_nrf52:
return pins.gpio_pin_schema(
{CONF_ANALOG: True, CONF_INPUT: True}, internal=True
)(value)
raise NotImplementedError
FILTER_SOURCE_FILES = filter_source_files_from_platform(
{
"adc_sensor_esp32.cpp": {
PlatformFramework.ESP32_ARDUINO,
PlatformFramework.ESP32_IDF,
},
"adc_sensor_esp8266.cpp": {PlatformFramework.ESP8266_ARDUINO},
"adc_sensor_rp2040.cpp": {PlatformFramework.RP2040_ARDUINO},
"adc_sensor_libretiny.cpp": {
PlatformFramework.BK72XX_ARDUINO,
PlatformFramework.RTL87XX_ARDUINO,
PlatformFramework.LN882X_ARDUINO,
},
}
)

View File

@@ -13,10 +13,6 @@
#include "hal/adc_types.h" // This defines ADC_CHANNEL_MAX
#endif // USE_ESP32
#ifdef USE_ZEPHYR
#include <zephyr/drivers/adc.h>
#endif
namespace esphome {
namespace adc {
@@ -42,15 +38,15 @@ enum class SamplingMode : uint8_t {
const LogString *sampling_mode_to_str(SamplingMode mode);
template<typename T> class Aggregator {
class Aggregator {
public:
Aggregator(SamplingMode mode);
void add_sample(T value);
T aggregate();
void add_sample(uint32_t value);
uint32_t aggregate();
protected:
T aggr_{0};
uint8_t samples_{0};
uint32_t aggr_{0};
uint32_t samples_{0};
SamplingMode mode_{SamplingMode::AVG};
};
@@ -73,11 +69,6 @@ class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage
/// @return A float representing the setup priority.
float get_setup_priority() const override;
#ifdef USE_ZEPHYR
/// Set the ADC channel to be used by the ADC sensor.
/// @param channel Pointer to an adc_dt_spec structure representing the ADC channel.
void set_adc_channel(const adc_dt_spec *channel) { this->channel_ = channel; }
#endif
/// Set the GPIO pin to be used by the ADC sensor.
/// @param pin Pointer to an InternalGPIOPin representing the ADC input pin.
void set_pin(InternalGPIOPin *pin) { this->pin_ = pin; }
@@ -145,8 +136,8 @@ class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage
adc_oneshot_unit_handle_t adc_handle_{nullptr};
adc_cali_handle_t calibration_handle_{nullptr};
adc_atten_t attenuation_{ADC_ATTEN_DB_0};
adc_channel_t channel_{};
adc_unit_t adc_unit_{};
adc_channel_t channel_;
adc_unit_t adc_unit_;
struct SetupFlags {
uint8_t init_complete : 1;
uint8_t config_complete : 1;
@@ -160,10 +151,6 @@ class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage
#ifdef USE_RP2040
bool is_temperature_{false};
#endif // USE_RP2040
#ifdef USE_ZEPHYR
const struct adc_dt_spec *channel_ = nullptr;
#endif
};
} // namespace adc

View File

@@ -18,15 +18,15 @@ const LogString *sampling_mode_to_str(SamplingMode mode) {
return LOG_STR("unknown");
}
template<typename T> Aggregator<T>::Aggregator(SamplingMode mode) {
Aggregator::Aggregator(SamplingMode mode) {
this->mode_ = mode;
// set to max uint if mode is "min"
if (mode == SamplingMode::MIN) {
this->aggr_ = std::numeric_limits<T>::max();
this->aggr_ = UINT32_MAX;
}
}
template<typename T> void Aggregator<T>::add_sample(T value) {
void Aggregator::add_sample(uint32_t value) {
this->samples_ += 1;
switch (this->mode_) {
@@ -47,7 +47,7 @@ template<typename T> void Aggregator<T>::add_sample(T value) {
}
}
template<typename T> T Aggregator<T>::aggregate() {
uint32_t Aggregator::aggregate() {
if (this->mode_ == SamplingMode::AVG) {
if (this->samples_ == 0) {
return this->aggr_;
@@ -59,12 +59,6 @@ template<typename T> T Aggregator<T>::aggregate() {
return this->aggr_;
}
#ifdef USE_ZEPHYR
template class Aggregator<int32_t>;
#else
template class Aggregator<uint32_t>;
#endif
void ADCSensor::update() {
float value_v = this->sample();
ESP_LOGV(TAG, "'%s': Voltage=%.4fV", this->get_name().c_str(), value_v);

View File

@@ -37,6 +37,7 @@ const LogString *adc_unit_to_str(adc_unit_t unit) {
}
void ADCSensor::setup() {
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
// Check if another sensor already initialized this ADC unit
if (ADCSensor::shared_adc_handles[this->adc_unit_] == nullptr) {
adc_oneshot_unit_init_cfg_t init_config = {}; // Zero initialize
@@ -72,9 +73,10 @@ void ADCSensor::setup() {
// Initialize ADC calibration
if (this->calibration_handle_ == nullptr) {
adc_cali_handle_t handle = nullptr;
esp_err_t err;
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
// RISC-V variants and S3 use curve fitting calibration
adc_cali_curve_fitting_config_t cali_config = {}; // Zero initialize first
#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
@@ -152,7 +154,7 @@ float ADCSensor::sample() {
}
float ADCSensor::sample_fixed_attenuation_() {
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
auto aggr = Aggregator(this->sampling_mode_);
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
int raw;
@@ -186,7 +188,7 @@ float ADCSensor::sample_fixed_attenuation_() {
ESP_LOGW(TAG, "ADC calibration conversion failed with error %d, disabling calibration", err);
if (this->calibration_handle_ != nullptr) {
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
#else // Other ESP32 variants use line fitting calibration
adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
@@ -219,7 +221,7 @@ float ADCSensor::sample_autorange_() {
if (this->calibration_handle_ != nullptr) {
// Delete old calibration handle
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
#else
adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
@@ -231,7 +233,7 @@ float ADCSensor::sample_autorange_() {
adc_cali_handle_t handle = nullptr;
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
adc_cali_curve_fitting_config_t cali_config = {};
#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
cali_config.chan = this->channel_;
@@ -241,8 +243,6 @@ float ADCSensor::sample_autorange_() {
cali_config.bitwidth = ADC_BITWIDTH_DEFAULT;
err = adc_cali_create_scheme_curve_fitting(&cali_config, &handle);
ESP_LOGVV(TAG, "Autorange atten=%d: Calibration handle creation %s (err=%d)", atten,
(err == ESP_OK) ? "SUCCESS" : "FAILED", err);
#else
adc_cali_line_fitting_config_t cali_config = {
.unit_id = this->adc_unit_,
@@ -253,20 +253,16 @@ float ADCSensor::sample_autorange_() {
#endif
};
err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
ESP_LOGVV(TAG, "Autorange atten=%d: Calibration handle creation %s (err=%d)", atten,
(err == ESP_OK) ? "SUCCESS" : "FAILED", err);
#endif
int raw;
err = adc_oneshot_read(this->adc_handle_, this->channel_, &raw);
ESP_LOGVV(TAG, "Autorange atten=%d: Raw ADC read %s, value=%d (err=%d)", atten,
(err == ESP_OK) ? "SUCCESS" : "FAILED", raw, err);
if (err != ESP_OK) {
ESP_LOGW(TAG, "ADC read failed in autorange with error %d", err);
if (handle != nullptr) {
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
adc_cali_delete_scheme_curve_fitting(handle);
#else
adc_cali_delete_scheme_line_fitting(handle);
@@ -281,21 +277,18 @@ float ADCSensor::sample_autorange_() {
err = adc_cali_raw_to_voltage(handle, raw, &voltage_mv);
if (err == ESP_OK) {
voltage = voltage_mv / 1000.0f;
ESP_LOGVV(TAG, "Autorange atten=%d: CALIBRATED - raw=%d -> %dmV -> %.6fV", atten, raw, voltage_mv, voltage);
} else {
voltage = raw * 3.3f / 4095.0f;
ESP_LOGVV(TAG, "Autorange atten=%d: UNCALIBRATED FALLBACK - raw=%d -> %.6fV (3.3V ref)", atten, raw, voltage);
}
// Clean up calibration handle
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4
USE_ESP32_VARIANT_ESP32S3 || USE_ESP32_VARIANT_ESP32H2
adc_cali_delete_scheme_curve_fitting(handle);
#else
adc_cali_delete_scheme_line_fitting(handle);
#endif
} else {
voltage = raw * 3.3f / 4095.0f;
ESP_LOGVV(TAG, "Autorange atten=%d: NO CALIBRATION - raw=%d -> %.6fV (3.3V ref)", atten, raw, voltage);
}
return {raw, voltage};
@@ -333,32 +326,18 @@ float ADCSensor::sample_autorange_() {
}
const int adc_half = 2048;
const uint32_t c12 = std::min(raw12, adc_half);
const int32_t c6_signed = adc_half - std::abs(raw6 - adc_half);
const uint32_t c6 = (c6_signed > 0) ? c6_signed : 0; // Clamp to prevent underflow
const int32_t c2_signed = adc_half - std::abs(raw2 - adc_half);
const uint32_t c2 = (c2_signed > 0) ? c2_signed : 0; // Clamp to prevent underflow
const uint32_t c0 = std::min(4095 - raw0, adc_half);
const uint32_t csum = c12 + c6 + c2 + c0;
ESP_LOGVV(TAG, "Autorange summary:");
ESP_LOGVV(TAG, " Raw readings: 12db=%d, 6db=%d, 2.5db=%d, 0db=%d", raw12, raw6, raw2, raw0);
ESP_LOGVV(TAG, " Voltages: 12db=%.6f, 6db=%.6f, 2.5db=%.6f, 0db=%.6f", mv12, mv6, mv2, mv0);
ESP_LOGVV(TAG, " Coefficients: c12=%u, c6=%u, c2=%u, c0=%u, sum=%u", c12, c6, c2, c0, csum);
uint32_t c12 = std::min(raw12, adc_half);
uint32_t c6 = adc_half - std::abs(raw6 - adc_half);
uint32_t c2 = adc_half - std::abs(raw2 - adc_half);
uint32_t c0 = std::min(4095 - raw0, adc_half);
uint32_t csum = c12 + c6 + c2 + c0;
if (csum == 0) {
ESP_LOGE(TAG, "Invalid weight sum in autorange calculation");
return NAN;
}
const float final_result = (mv12 * c12 + mv6 * c6 + mv2 * c2 + mv0 * c0) / csum;
ESP_LOGV(TAG, "Autorange final: (%.6f*%u + %.6f*%u + %.6f*%u + %.6f*%u)/%u = %.6fV", mv12, c12, mv6, c6, mv2, c2, mv0,
c0, csum, final_result);
return final_result;
return (mv12 * c12 + mv6 * c6 + mv2 * c2 + mv0 * c0) / csum;
}
} // namespace adc

View File

@@ -17,6 +17,7 @@ namespace adc {
static const char *const TAG = "adc.esp8266";
void ADCSensor::setup() {
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
#ifndef USE_ADC_SENSOR_VCC
this->pin_->setup();
#endif
@@ -37,7 +38,7 @@ void ADCSensor::dump_config() {
}
float ADCSensor::sample() {
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
auto aggr = Aggregator(this->sampling_mode_);
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
uint32_t raw = 0;

View File

@@ -9,6 +9,7 @@ namespace adc {
static const char *const TAG = "adc.libretiny";
void ADCSensor::setup() {
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
#ifndef USE_ADC_SENSOR_VCC
this->pin_->setup();
#endif // !USE_ADC_SENSOR_VCC
@@ -30,7 +31,7 @@ void ADCSensor::dump_config() {
float ADCSensor::sample() {
uint32_t raw = 0;
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
auto aggr = Aggregator(this->sampling_mode_);
if (this->output_raw_) {
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {

View File

@@ -14,6 +14,7 @@ namespace adc {
static const char *const TAG = "adc.rp2040";
void ADCSensor::setup() {
ESP_LOGCONFIG(TAG, "Running setup for '%s'", this->get_name().c_str());
static bool initialized = false;
if (!initialized) {
adc_init();
@@ -41,7 +42,7 @@ void ADCSensor::dump_config() {
float ADCSensor::sample() {
uint32_t raw = 0;
auto aggr = Aggregator<uint32_t>(this->sampling_mode_);
auto aggr = Aggregator(this->sampling_mode_);
if (this->is_temperature_) {
adc_set_temp_sensor_enabled(true);

View File

@@ -1,207 +0,0 @@
#include "adc_sensor.h"
#ifdef USE_ZEPHYR
#include "esphome/core/log.h"
#include "hal/nrf_saadc.h"
namespace esphome {
namespace adc {
static const char *const TAG = "adc.zephyr";
void ADCSensor::setup() {
if (!adc_is_ready_dt(this->channel_)) {
ESP_LOGE(TAG, "ADC controller device %s not ready", this->channel_->dev->name);
return;
}
auto err = adc_channel_setup_dt(this->channel_);
if (err < 0) {
ESP_LOGE(TAG, "Could not setup channel %s (%d)", this->channel_->dev->name, err);
return;
}
}
#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
static const LogString *gain_to_str(enum adc_gain gain) {
switch (gain) {
case ADC_GAIN_1_6:
return LOG_STR("1/6");
case ADC_GAIN_1_5:
return LOG_STR("1/5");
case ADC_GAIN_1_4:
return LOG_STR("1/4");
case ADC_GAIN_1_3:
return LOG_STR("1/3");
case ADC_GAIN_2_5:
return LOG_STR("2/5");
case ADC_GAIN_1_2:
return LOG_STR("1/2");
case ADC_GAIN_2_3:
return LOG_STR("2/3");
case ADC_GAIN_4_5:
return LOG_STR("4/5");
case ADC_GAIN_1:
return LOG_STR("1");
case ADC_GAIN_2:
return LOG_STR("2");
case ADC_GAIN_3:
return LOG_STR("3");
case ADC_GAIN_4:
return LOG_STR("4");
case ADC_GAIN_6:
return LOG_STR("6");
case ADC_GAIN_8:
return LOG_STR("8");
case ADC_GAIN_12:
return LOG_STR("12");
case ADC_GAIN_16:
return LOG_STR("16");
case ADC_GAIN_24:
return LOG_STR("24");
case ADC_GAIN_32:
return LOG_STR("32");
case ADC_GAIN_64:
return LOG_STR("64");
case ADC_GAIN_128:
return LOG_STR("128");
}
return LOG_STR("undefined gain");
}
static const LogString *reference_to_str(enum adc_reference reference) {
switch (reference) {
case ADC_REF_VDD_1:
return LOG_STR("VDD");
case ADC_REF_VDD_1_2:
return LOG_STR("VDD/2");
case ADC_REF_VDD_1_3:
return LOG_STR("VDD/3");
case ADC_REF_VDD_1_4:
return LOG_STR("VDD/4");
case ADC_REF_INTERNAL:
return LOG_STR("INTERNAL");
case ADC_REF_EXTERNAL0:
return LOG_STR("External, input 0");
case ADC_REF_EXTERNAL1:
return LOG_STR("External, input 1");
}
return LOG_STR("undefined reference");
}
static const LogString *input_to_str(uint8_t input) {
switch (input) {
case NRF_SAADC_INPUT_AIN0:
return LOG_STR("AIN0");
case NRF_SAADC_INPUT_AIN1:
return LOG_STR("AIN1");
case NRF_SAADC_INPUT_AIN2:
return LOG_STR("AIN2");
case NRF_SAADC_INPUT_AIN3:
return LOG_STR("AIN3");
case NRF_SAADC_INPUT_AIN4:
return LOG_STR("AIN4");
case NRF_SAADC_INPUT_AIN5:
return LOG_STR("AIN5");
case NRF_SAADC_INPUT_AIN6:
return LOG_STR("AIN6");
case NRF_SAADC_INPUT_AIN7:
return LOG_STR("AIN7");
case NRF_SAADC_INPUT_VDD:
return LOG_STR("VDD");
case NRF_SAADC_INPUT_VDDHDIV5:
return LOG_STR("VDDHDIV5");
}
return LOG_STR("undefined input");
}
#endif // ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
void ADCSensor::dump_config() {
LOG_SENSOR("", "ADC Sensor", this);
LOG_PIN(" Pin: ", this->pin_);
#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERBOSE
ESP_LOGV(TAG,
" Name: %s\n"
" Channel: %d\n"
" vref_mv: %d\n"
" Resolution %d\n"
" Oversampling %d",
this->channel_->dev->name, this->channel_->channel_id, this->channel_->vref_mv, this->channel_->resolution,
this->channel_->oversampling);
ESP_LOGV(TAG,
" Gain: %s\n"
" reference: %s\n"
" acquisition_time: %d\n"
" differential %s",
LOG_STR_ARG(gain_to_str(this->channel_->channel_cfg.gain)),
LOG_STR_ARG(reference_to_str(this->channel_->channel_cfg.reference)),
this->channel_->channel_cfg.acquisition_time, YESNO(this->channel_->channel_cfg.differential));
if (this->channel_->channel_cfg.differential) {
ESP_LOGV(TAG,
" Positive: %s\n"
" Negative: %s",
LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_positive)),
LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_negative)));
} else {
ESP_LOGV(TAG, " Positive: %s", LOG_STR_ARG(input_to_str(this->channel_->channel_cfg.input_positive)));
}
#endif
LOG_UPDATE_INTERVAL(this);
}
float ADCSensor::sample() {
auto aggr = Aggregator<int32_t>(this->sampling_mode_);
int err;
for (uint8_t sample = 0; sample < this->sample_count_; sample++) {
int16_t buf = 0;
struct adc_sequence sequence = {
.buffer = &buf,
/* buffer size in bytes, not number of samples */
.buffer_size = sizeof(buf),
};
int32_t val_raw;
err = adc_sequence_init_dt(this->channel_, &sequence);
if (err < 0) {
ESP_LOGE(TAG, "Could sequence init %s (%d)", this->channel_->dev->name, err);
return 0.0;
}
err = adc_read(this->channel_->dev, &sequence);
if (err < 0) {
ESP_LOGE(TAG, "Could not read %s (%d)", this->channel_->dev->name, err);
return 0.0;
}
val_raw = (int32_t) buf;
if (!this->channel_->channel_cfg.differential) {
// https://github.com/adafruit/Adafruit_nRF52_Arduino/blob/0ed4d9ffc674ae407be7cacf5696a02f5e789861/cores/nRF5/wiring_analog_nRF52.c#L222
if (val_raw < 0) {
val_raw = 0;
}
}
aggr.add_sample(val_raw);
}
int32_t val_mv = aggr.aggregate();
if (this->output_raw_) {
return val_mv;
}
err = adc_raw_to_millivolts_dt(this->channel_, &val_mv);
/* conversion to mV may not be supported, skip if not */
if (err < 0) {
ESP_LOGE(TAG, "Value in mV not available %s (%d)", this->channel_->dev->name, err);
return 0.0;
}
return val_mv / 1000.0f;
}
} // namespace adc
} // namespace esphome
#endif

View File

@@ -3,13 +3,6 @@ import logging
import esphome.codegen as cg
from esphome.components import sensor, voltage_sampler
from esphome.components.esp32 import get_esp32_variant
from esphome.components.nrf52.const import AIN_TO_GPIO, EXTRA_ADC
from esphome.components.zephyr import (
zephyr_add_overlay,
zephyr_add_prj_conf,
zephyr_add_user,
)
from esphome.config_helpers import filter_source_files_from_platform
import esphome.config_validation as cv
from esphome.const import (
CONF_ATTENUATION,
@@ -18,10 +11,8 @@ from esphome.const import (
CONF_PIN,
CONF_RAW,
DEVICE_CLASS_VOLTAGE,
PLATFORM_NRF52,
STATE_CLASS_MEASUREMENT,
UNIT_VOLT,
PlatformFramework,
)
from esphome.core import CORE
@@ -69,10 +60,6 @@ ADCSensor = adc_ns.class_(
"ADCSensor", sensor.Sensor, cg.PollingComponent, voltage_sampler.VoltageSampler
)
CONF_NRF_SAADC = "nrf_saadc"
adc_dt_spec = cg.global_ns.class_("adc_dt_spec")
CONFIG_SCHEMA = cv.All(
sensor.sensor_schema(
ADCSensor,
@@ -88,7 +75,6 @@ CONFIG_SCHEMA = cv.All(
cv.SplitDefault(CONF_ATTENUATION, esp32="0db"): cv.All(
cv.only_on_esp32, _attenuation
),
cv.OnlyWith(CONF_NRF_SAADC, PLATFORM_NRF52): cv.declare_id(adc_dt_spec),
cv.Optional(CONF_SAMPLES, default=1): cv.int_range(min=1, max=255),
cv.Optional(CONF_SAMPLING_MODE, default="avg"): _sampling_mode,
}
@@ -97,8 +83,6 @@ CONFIG_SCHEMA = cv.All(
validate_config,
)
CONF_ADC_CHANNEL_ID = "adc_channel_id"
async def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
@@ -109,7 +93,7 @@ async def to_code(config):
cg.add_define("USE_ADC_SENSOR_VCC")
elif config[CONF_PIN] == "TEMPERATURE":
cg.add(var.set_is_temperature())
elif not CORE.is_nrf52 or config[CONF_PIN][CONF_NUMBER] not in EXTRA_ADC:
else:
pin = await cg.gpio_pin_expression(config[CONF_PIN])
cg.add(var.set_pin(pin))
@@ -138,59 +122,3 @@ async def to_code(config):
):
chan = ESP32_VARIANT_ADC2_PIN_TO_CHANNEL[variant][pin_num]
cg.add(var.set_channel(adc_unit_t.ADC_UNIT_2, chan))
elif CORE.is_nrf52:
CORE.data.setdefault(CONF_ADC_CHANNEL_ID, 0)
channel_id = CORE.data[CONF_ADC_CHANNEL_ID]
CORE.data[CONF_ADC_CHANNEL_ID] = channel_id + 1
zephyr_add_prj_conf("ADC", True)
nrf_saadc = config[CONF_NRF_SAADC]
rhs = cg.RawExpression(
f"ADC_DT_SPEC_GET_BY_IDX(DT_PATH(zephyr_user), {channel_id})"
)
adc = cg.new_Pvariable(nrf_saadc, rhs)
cg.add(var.set_adc_channel(adc))
gain = "ADC_GAIN_1_6"
pin_number = config[CONF_PIN][CONF_NUMBER]
if pin_number == "VDDHDIV5":
gain = "ADC_GAIN_1_2"
if isinstance(pin_number, int):
GPIO_TO_AIN = {v: k for k, v in AIN_TO_GPIO.items()}
pin_number = GPIO_TO_AIN[pin_number]
zephyr_add_user("io-channels", f"<&adc {channel_id}>")
zephyr_add_overlay(
f"""
&adc {{
#address-cells = <1>;
#size-cells = <0>;
channel@{channel_id} {{
reg = <{channel_id}>;
zephyr,gain = "{gain}";
zephyr,reference = "ADC_REF_INTERNAL";
zephyr,acquisition-time = <ADC_ACQ_TIME_DEFAULT>;
zephyr,input-positive = <NRF_SAADC_{pin_number}>;
zephyr,resolution = <14>;
zephyr,oversampling = <8>;
}};
}};
"""
)
FILTER_SOURCE_FILES = filter_source_files_from_platform(
{
"adc_sensor_esp32.cpp": {
PlatformFramework.ESP32_ARDUINO,
PlatformFramework.ESP32_IDF,
},
"adc_sensor_esp8266.cpp": {PlatformFramework.ESP8266_ARDUINO},
"adc_sensor_rp2040.cpp": {PlatformFramework.RP2040_ARDUINO},
"adc_sensor_libretiny.cpp": {
PlatformFramework.BK72XX_ARDUINO,
PlatformFramework.RTL87XX_ARDUINO,
PlatformFramework.LN882X_ARDUINO,
},
"adc_sensor_zephyr.cpp": {PlatformFramework.NRF52_ZEPHYR},
}
)

View File

@@ -8,7 +8,10 @@ static const char *const TAG = "adc128s102";
float ADC128S102::get_setup_priority() const { return setup_priority::HARDWARE; }
void ADC128S102::setup() { this->spi_setup(); }
void ADC128S102::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
this->spi_setup();
}
void ADC128S102::dump_config() {
ESP_LOGCONFIG(TAG, "ADC128S102:");

View File

@@ -113,7 +113,7 @@ void ADE7880::update() {
if (this->channel_a_ != nullptr) {
auto *chan = this->channel_a_;
this->update_sensor_from_s24zp_register16_(chan->current, AIRMS, [](float val) { return val / 100000.0f; });
this->update_sensor_from_s24zp_register16_(chan->voltage, AVRMS, [](float val) { return val / 10000.0f; });
this->update_sensor_from_s24zp_register16_(chan->voltage, BVRMS, [](float val) { return val / 10000.0f; });
this->update_sensor_from_s24zp_register16_(chan->active_power, AWATT, [](float val) { return val / 100.0f; });
this->update_sensor_from_s24zp_register16_(chan->apparent_power, AVA, [](float val) { return val / 100.0f; });
this->update_sensor_from_s16_register16_(chan->power_factor, APF,

View File

@@ -36,7 +36,6 @@ from esphome.const import (
UNIT_WATT,
UNIT_WATT_HOURS,
)
from esphome.types import ConfigType
DEPENDENCIES = ["i2c"]
@@ -52,20 +51,6 @@ CONF_POWER_GAIN = "power_gain"
CONF_NEUTRAL = "neutral"
# Tuple of power channel phases
POWER_PHASES = (CONF_PHASE_A, CONF_PHASE_B, CONF_PHASE_C)
# Tuple of sensor types that can be configured for power channels
POWER_SENSOR_TYPES = (
CONF_CURRENT,
CONF_VOLTAGE,
CONF_ACTIVE_POWER,
CONF_APPARENT_POWER,
CONF_POWER_FACTOR,
CONF_FORWARD_ACTIVE_ENERGY,
CONF_REVERSE_ACTIVE_ENERGY,
)
NEUTRAL_CHANNEL_SCHEMA = cv.Schema(
{
cv.GenerateID(): cv.declare_id(NeutralChannel),
@@ -165,64 +150,7 @@ POWER_CHANNEL_SCHEMA = cv.Schema(
}
)
def prefix_sensor_name(
sensor_conf: ConfigType,
channel_name: str,
channel_config: ConfigType,
sensor_type: str,
) -> None:
"""Helper to prefix sensor name with channel name.
Args:
sensor_conf: The sensor configuration (dict or string)
channel_name: The channel name to prefix with
channel_config: The channel configuration to update
sensor_type: The sensor type key in the channel config
"""
if isinstance(sensor_conf, dict) and CONF_NAME in sensor_conf:
sensor_name = sensor_conf[CONF_NAME]
if sensor_name and not sensor_name.startswith(channel_name):
sensor_conf[CONF_NAME] = f"{channel_name} {sensor_name}"
elif isinstance(sensor_conf, str):
# Simple value case - convert to dict with prefixed name
channel_config[sensor_type] = {CONF_NAME: f"{channel_name} {sensor_conf}"}
def process_channel_sensors(
config: ConfigType, channel_key: str, sensor_types: tuple
) -> None:
"""Process sensors for a channel and prefix their names.
Args:
config: The main configuration
channel_key: The channel key (e.g., CONF_PHASE_A, CONF_NEUTRAL)
sensor_types: Tuple of sensor types to process for this channel
"""
if not (channel_config := config.get(channel_key)) or not (
channel_name := channel_config.get(CONF_NAME)
):
return
for sensor_type in sensor_types:
if sensor_conf := channel_config.get(sensor_type):
prefix_sensor_name(sensor_conf, channel_name, channel_config, sensor_type)
def preprocess_channels(config: ConfigType) -> ConfigType:
"""Preprocess channel configurations to add channel name prefix to sensor names."""
# Process power channels
for channel in POWER_PHASES:
process_channel_sensors(config, channel, POWER_SENSOR_TYPES)
# Process neutral channel
process_channel_sensors(config, CONF_NEUTRAL, (CONF_CURRENT,))
return config
CONFIG_SCHEMA = cv.All(
preprocess_channels,
CONFIG_SCHEMA = (
cv.Schema(
{
cv.GenerateID(): cv.declare_id(ADE7880),
@@ -239,7 +167,7 @@ CONFIG_SCHEMA = cv.All(
}
)
.extend(cv.polling_component_schema("60s"))
.extend(i2c.i2c_device_schema(0x38)),
.extend(i2c.i2c_device_schema(0x38))
)
@@ -260,7 +188,15 @@ async def neutral_channel(config):
async def power_channel(config):
var = cg.new_Pvariable(config[CONF_ID])
for sensor_type in POWER_SENSOR_TYPES:
for sensor_type in [
CONF_CURRENT,
CONF_VOLTAGE,
CONF_ACTIVE_POWER,
CONF_APPARENT_POWER,
CONF_POWER_FACTOR,
CONF_FORWARD_ACTIVE_ENERGY,
CONF_REVERSE_ACTIVE_ENERGY,
]:
if conf := config.get(sensor_type):
sens = await sensor.new_sensor(conf)
cg.add(getattr(var, f"set_{sensor_type}")(sens))
@@ -280,6 +216,44 @@ async def power_channel(config):
return var
def final_validate(config):
for channel in [CONF_PHASE_A, CONF_PHASE_B, CONF_PHASE_C]:
if channel := config.get(channel):
channel_name = channel.get(CONF_NAME)
for sensor_type in [
CONF_CURRENT,
CONF_VOLTAGE,
CONF_ACTIVE_POWER,
CONF_APPARENT_POWER,
CONF_POWER_FACTOR,
CONF_FORWARD_ACTIVE_ENERGY,
CONF_REVERSE_ACTIVE_ENERGY,
]:
if conf := channel.get(sensor_type):
sensor_name = conf.get(CONF_NAME)
if (
sensor_name
and channel_name
and not sensor_name.startswith(channel_name)
):
conf[CONF_NAME] = f"{channel_name} {sensor_name}"
if channel := config.get(CONF_NEUTRAL):
channel_name = channel.get(CONF_NAME)
if conf := channel.get(CONF_CURRENT):
sensor_name = conf.get(CONF_NAME)
if (
sensor_name
and channel_name
and not sensor_name.startswith(channel_name)
):
conf[CONF_NAME] = f"{channel_name} {sensor_name}"
FINAL_VALIDATE_SCHEMA = final_validate
async def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
await cg.register_component(var, config)

View File

@@ -10,6 +10,7 @@ static const uint8_t ADS1115_REGISTER_CONVERSION = 0x00;
static const uint8_t ADS1115_REGISTER_CONFIG = 0x01;
void ADS1115Component::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
uint16_t value;
if (!this->read_byte_16(ADS1115_REGISTER_CONVERSION, &value)) {
this->mark_failed();

View File

@@ -9,6 +9,7 @@ static const char *const TAG = "ads1118";
static const uint8_t ADS1118_DATA_RATE_860_SPS = 0b111;
void ADS1118::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
this->spi_setup();
this->config_ = 0;

View File

@@ -24,6 +24,8 @@ static const uint16_t ZP_CURRENT = 0x0000;
static const uint16_t ZP_DEFAULT = 0xFFFF;
void AGS10Component::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
auto version = this->read_version_();
if (version) {
ESP_LOGD(TAG, "AGS10 Sensor Version: 0x%02X", *version);
@@ -43,6 +45,8 @@ void AGS10Component::setup() {
} else {
ESP_LOGE(TAG, "AGS10 Sensor Resistance: unknown");
}
ESP_LOGD(TAG, "Sensor initialized");
}
void AGS10Component::update() {
@@ -89,7 +93,7 @@ void AGS10Component::dump_config() {
bool AGS10Component::new_i2c_address(uint8_t newaddress) {
uint8_t rev_newaddress = ~newaddress;
std::array<uint8_t, 5> data{newaddress, rev_newaddress, newaddress, rev_newaddress, 0};
data[4] = crc8(data.data(), 4, 0xFF, 0x31, true);
data[4] = calc_crc8_(data, 4);
if (!this->write_bytes(REG_ADDRESS, data)) {
this->error_code_ = COMMUNICATION_FAILED;
this->status_set_warning();
@@ -109,7 +113,7 @@ bool AGS10Component::set_zero_point_with_current_resistance() { return this->set
bool AGS10Component::set_zero_point_with(uint16_t value) {
std::array<uint8_t, 5> data{0x00, 0x0C, (uint8_t) ((value >> 8) & 0xFF), (uint8_t) (value & 0xFF), 0};
data[4] = crc8(data.data(), 4, 0xFF, 0x31, true);
data[4] = calc_crc8_(data, 4);
if (!this->write_bytes(REG_CALIBRATION, data)) {
this->error_code_ = COMMUNICATION_FAILED;
this->status_set_warning();
@@ -184,7 +188,7 @@ template<size_t N> optional<std::array<uint8_t, N>> AGS10Component::read_and_che
auto res = *data;
auto crc_byte = res[len];
if (crc_byte != crc8(res.data(), len, 0xFF, 0x31, true)) {
if (crc_byte != calc_crc8_(res, len)) {
this->error_code_ = CRC_CHECK_FAILED;
ESP_LOGE(TAG, "Reading AGS10 version failed: crc error!");
return optional<std::array<uint8_t, N>>();
@@ -192,5 +196,20 @@ template<size_t N> optional<std::array<uint8_t, N>> AGS10Component::read_and_che
return data;
}
template<size_t N> uint8_t AGS10Component::calc_crc8_(std::array<uint8_t, N> dat, uint8_t num) {
uint8_t i, byte1, crc = 0xFF;
for (byte1 = 0; byte1 < num; byte1++) {
crc ^= (dat[byte1]);
for (i = 0; i < 8; i++) {
if (crc & 0x80) {
crc = (crc << 1) ^ 0x31;
} else {
crc = (crc << 1);
}
}
}
return crc;
}
} // namespace ags10
} // namespace esphome

View File

@@ -1,9 +1,9 @@
#pragma once
#include "esphome/components/i2c/i2c.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/core/automation.h"
#include "esphome/core/component.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/components/i2c/i2c.h"
namespace esphome {
namespace ags10 {
@@ -99,6 +99,16 @@ class AGS10Component : public PollingComponent, public i2c::I2CDevice {
* Read, checks and returns data from the sensor.
*/
template<size_t N> optional<std::array<uint8_t, N>> read_and_check_(uint8_t a_register);
/**
* Calculates CRC8 value.
*
* CRC8 calculation, initial value: 0xFF, polynomial: 0x31 (x8+ x5+ x4+1)
*
* @param[in] dat the data buffer
* @param num number of bytes in the buffer
*/
template<size_t N> uint8_t calc_crc8_(std::array<uint8_t, N> dat, uint8_t num);
};
template<typename... Ts> class AGS10NewI2cAddressAction : public Action<Ts...>, public Parented<AGS10Component> {

View File

@@ -38,6 +38,8 @@ static const uint8_t AHT10_STATUS_BUSY = 0x80;
static const float AHT10_DIVISOR = 1048576.0f; // 2^20, used for temperature and humidity calculations
void AHT10Component::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
if (this->write(AHT10_SOFTRESET_CMD, sizeof(AHT10_SOFTRESET_CMD)) != i2c::ERROR_OK) {
ESP_LOGE(TAG, "Reset failed");
}
@@ -78,6 +80,8 @@ void AHT10Component::setup() {
this->mark_failed();
return;
}
ESP_LOGV(TAG, "Initialization complete");
}
void AHT10Component::restart_read_() {
@@ -96,7 +100,7 @@ void AHT10Component::read_data_() {
ESP_LOGD(TAG, "Read attempt %d at %ums", this->read_count_, (unsigned) (millis() - this->start_time_));
}
if (this->read(data, 6) != i2c::ERROR_OK) {
this->status_set_warning(LOG_STR("Read failed, will retry"));
this->status_set_warning("Read failed, will retry");
this->restart_read_();
return;
}
@@ -113,7 +117,7 @@ void AHT10Component::read_data_() {
} else {
ESP_LOGD(TAG, "Invalid humidity, retrying");
if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
this->status_set_warning(LOG_STR(ESP_LOG_MSG_COMM_FAIL));
this->status_set_warning(ESP_LOG_MSG_COMM_FAIL);
}
this->restart_read_();
return;
@@ -144,7 +148,7 @@ void AHT10Component::update() {
return;
this->start_time_ = millis();
if (this->write(AHT10_MEASURE_CMD, sizeof(AHT10_MEASURE_CMD)) != i2c::ERROR_OK) {
this->status_set_warning(LOG_STR(ESP_LOG_MSG_COMM_FAIL));
this->status_set_warning(ESP_LOG_MSG_COMM_FAIL);
return;
}
this->restart_read_();

View File

@@ -17,6 +17,8 @@ static const char *const TAG = "aic3204";
}
void AIC3204::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
// Set register page to 0
ERROR_CHECK(this->write_byte(AIC3204_PAGE_CTRL, 0x00), "Set page 0 failed");
// Initiate SW reset (PLL is powered off as part of reset)

View File

@@ -18,6 +18,6 @@ CONFIG_SCHEMA = cv.Schema(
).extend(esp32_ble_tracker.ESP_BLE_DEVICE_SCHEMA)
async def to_code(config):
def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
await esp32_ble_tracker.register_ble_device(var, config)
yield esp32_ble_tracker.register_ble_device(var, config)

View File

@@ -13,7 +13,7 @@ from esphome.const import (
CONF_TRIGGER_ID,
CONF_WEB_SERVER,
)
from esphome.core import CORE, CoroPriority, coroutine_with_priority
from esphome.core import CORE, coroutine_with_priority
from esphome.core.entity_helpers import entity_duplicate_validator, setup_entity
from esphome.cpp_generator import MockObjClass
@@ -301,7 +301,8 @@ async def alarm_action_disarm_to_code(config, action_id, template_arg, args):
)
async def alarm_action_pending_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_action(
@@ -309,7 +310,8 @@ async def alarm_action_pending_to_code(config, action_id, template_arg, args):
)
async def alarm_action_trigger_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_action(
@@ -317,7 +319,8 @@ async def alarm_action_trigger_to_code(config, action_id, template_arg, args):
)
async def alarm_action_chime_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_action(
@@ -330,7 +333,8 @@ async def alarm_action_chime_to_code(config, action_id, template_arg, args):
)
async def alarm_action_ready_to_code(config, action_id, template_arg, args):
paren = await cg.get_variable(config[CONF_ID])
return cg.new_Pvariable(action_id, template_arg, paren)
var = cg.new_Pvariable(action_id, template_arg, paren)
return var
@automation.register_condition(
@@ -345,6 +349,7 @@ async def alarm_control_panel_is_armed_to_code(
return cg.new_Pvariable(condition_id, template_arg, paren)
@coroutine_with_priority(CoroPriority.CORE)
@coroutine_with_priority(100.0)
async def to_code(config):
cg.add_global(alarm_control_panel_ns.using)
cg.add_define("USE_ALARM_CONTROL_PANEL")

View File

@@ -29,6 +29,22 @@ namespace am2315c {
static const char *const TAG = "am2315c";
uint8_t AM2315C::crc8_(uint8_t *data, uint8_t len) {
uint8_t crc = 0xFF;
while (len--) {
crc ^= *data++;
for (uint8_t i = 0; i < 8; i++) {
if (crc & 0x80) {
crc <<= 1;
crc ^= 0x31;
} else {
crc <<= 1;
}
}
}
return crc;
}
bool AM2315C::reset_register_(uint8_t reg) {
// code based on demo code sent by www.aosong.com
// no further documentation.
@@ -70,10 +86,12 @@ bool AM2315C::convert_(uint8_t *data, float &humidity, float &temperature) {
humidity = raw * 9.5367431640625e-5;
raw = ((data[3] & 0x0F) << 16) | (data[4] << 8) | data[5];
temperature = raw * 1.9073486328125e-4 - 50;
return crc8(data, 6, 0xFF, 0x31, true) == data[6];
return this->crc8_(data, 6) == data[6];
}
void AM2315C::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
// get status
uint8_t status = 0;
if (this->read(&status, 1) != i2c::ERROR_OK) {

View File

@@ -21,9 +21,9 @@
// SOFTWARE.
#pragma once
#include "esphome/components/i2c/i2c.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/core/component.h"
#include "esphome/components/sensor/sensor.h"
#include "esphome/components/i2c/i2c.h"
namespace esphome {
namespace am2315c {
@@ -39,6 +39,7 @@ class AM2315C : public PollingComponent, public i2c::I2CDevice {
void set_humidity_sensor(sensor::Sensor *humidity_sensor) { this->humidity_sensor_ = humidity_sensor; }
protected:
uint8_t crc8_(uint8_t *data, uint8_t len);
bool convert_(uint8_t *data, float &humidity, float &temperature);
bool reset_register_(uint8_t reg);

View File

@@ -34,6 +34,7 @@ void AM2320Component::update() {
this->status_clear_warning();
}
void AM2320Component::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
uint8_t data[8];
data[0] = 0;
data[1] = 4;

View File

@@ -34,20 +34,17 @@ SetFrameAction = animation_ns.class_(
"AnimationSetFrameAction", automation.Action, cg.Parented.template(Animation_)
)
CONFIG_SCHEMA = cv.All(
espImage.IMAGE_SCHEMA.extend(
{
cv.Required(CONF_ID): cv.declare_id(Animation_),
cv.Optional(CONF_LOOP): cv.All(
{
cv.Optional(CONF_START_FRAME, default=0): cv.positive_int,
cv.Optional(CONF_END_FRAME): cv.positive_int,
cv.Optional(CONF_REPEAT): cv.positive_int,
}
),
},
),
espImage.validate_settings,
CONFIG_SCHEMA = espImage.IMAGE_SCHEMA.extend(
{
cv.Required(CONF_ID): cv.declare_id(Animation_),
cv.Optional(CONF_LOOP): cv.All(
{
cv.Optional(CONF_START_FRAME, default=0): cv.positive_int,
cv.Optional(CONF_END_FRAME): cv.positive_int,
cv.Optional(CONF_REPEAT): cv.positive_int,
}
),
},
)

View File

@@ -26,12 +26,12 @@ uint32_t Animation::get_animation_frame_count() const { return this->animation_f
int Animation::get_current_frame() const { return this->current_frame_; }
void Animation::next_frame() {
this->current_frame_++;
if (loop_count_ && static_cast<uint32_t>(this->current_frame_) == loop_end_frame_ &&
if (loop_count_ && this->current_frame_ == loop_end_frame_ &&
(this->loop_current_iteration_ < loop_count_ || loop_count_ < 0)) {
this->current_frame_ = loop_start_frame_;
this->loop_current_iteration_++;
}
if (static_cast<uint32_t>(this->current_frame_) >= animation_frame_count_) {
if (this->current_frame_ >= animation_frame_count_) {
this->loop_current_iteration_ = 1;
this->current_frame_ = 0;
}

View File

@@ -54,6 +54,8 @@ enum { // APDS9306 registers
}
void APDS9306::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
uint8_t id;
if (!this->read_byte(APDS9306_PART_ID, &id)) { // Part ID register
this->error_code_ = COMMUNICATION_FAILED;
@@ -84,6 +86,8 @@ void APDS9306::setup() {
// Set to active mode
APDS9306_WRITE_BYTE(APDS9306_MAIN_CTRL, 0x02);
ESP_LOGCONFIG(TAG, "APDS9306 setup complete");
}
void APDS9306::dump_config() {

View File

@@ -15,6 +15,7 @@ static const char *const TAG = "apds9960";
#define APDS9960_WRITE_BYTE(reg, value) APDS9960_ERROR_CHECK(this->write_byte(reg, value));
void APDS9960::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
uint8_t id;
if (!this->read_byte(0x92, &id)) { // ID register
this->error_code_ = COMMUNICATION_FAILED;

View File

@@ -1,5 +1,4 @@
import base64
import logging
from esphome import automation
from esphome.automation import Condition
@@ -9,59 +8,34 @@ import esphome.config_validation as cv
from esphome.const import (
CONF_ACTION,
CONF_ACTIONS,
CONF_CAPTURE_RESPONSE,
CONF_DATA,
CONF_DATA_TEMPLATE,
CONF_EVENT,
CONF_ID,
CONF_KEY,
CONF_MAX_CONNECTIONS,
CONF_ON_CLIENT_CONNECTED,
CONF_ON_CLIENT_DISCONNECTED,
CONF_ON_ERROR,
CONF_ON_SUCCESS,
CONF_PASSWORD,
CONF_PORT,
CONF_REBOOT_TIMEOUT,
CONF_RESPONSE_TEMPLATE,
CONF_SERVICE,
CONF_SERVICES,
CONF_TAG,
CONF_TRIGGER_ID,
CONF_VARIABLES,
)
from esphome.core import CORE, ID, CoroPriority, coroutine_with_priority
from esphome.cpp_generator import TemplateArgsType
from esphome.types import ConfigType
_LOGGER = logging.getLogger(__name__)
from esphome.core import CORE, coroutine_with_priority
DOMAIN = "api"
DEPENDENCIES = ["network"]
CODEOWNERS = ["@esphome/core"]
def AUTO_LOAD(config: ConfigType) -> list[str]:
"""Conditionally auto-load json only when capture_response is used."""
base = ["socket"]
# Check if any homeassistant.action/homeassistant.service has capture_response: true
# This flag is set during config validation in _validate_response_config
if not config or CORE.data.get(DOMAIN, {}).get(CONF_CAPTURE_RESPONSE, False):
return base + ["json"]
return base
AUTO_LOAD = ["socket"]
CODEOWNERS = ["@OttoWinter"]
api_ns = cg.esphome_ns.namespace("api")
APIServer = api_ns.class_("APIServer", cg.Component, cg.Controller)
HomeAssistantServiceCallAction = api_ns.class_(
"HomeAssistantServiceCallAction", automation.Action
)
ActionResponse = api_ns.class_("ActionResponse")
HomeAssistantActionResponseTrigger = api_ns.class_(
"HomeAssistantActionResponseTrigger", automation.Trigger
)
APIConnectedCondition = api_ns.class_("APIConnectedCondition", Condition)
UserServiceTrigger = api_ns.class_("UserServiceTrigger", automation.Trigger)
@@ -79,10 +53,6 @@ SERVICE_ARG_NATIVE_TYPES = {
CONF_ENCRYPTION = "encryption"
CONF_BATCH_DELAY = "batch_delay"
CONF_CUSTOM_SERVICES = "custom_services"
CONF_HOMEASSISTANT_SERVICES = "homeassistant_services"
CONF_HOMEASSISTANT_STATES = "homeassistant_states"
CONF_LISTEN_BACKLOG = "listen_backlog"
CONF_MAX_SEND_QUEUE = "max_send_queue"
def validate_encryption_key(value):
@@ -129,32 +99,6 @@ def _encryption_schema(config):
return ENCRYPTION_SCHEMA(config)
def _validate_api_config(config: ConfigType) -> ConfigType:
"""Validate API configuration with mutual exclusivity check and deprecation warning."""
# Check if both password and encryption are configured
has_password = CONF_PASSWORD in config and config[CONF_PASSWORD]
has_encryption = CONF_ENCRYPTION in config
if has_password and has_encryption:
raise cv.Invalid(
"The 'password' and 'encryption' options are mutually exclusive. "
"The API client only supports one authentication method at a time. "
"Please remove one of them. "
"Note: 'password' authentication is deprecated and will be removed in version 2026.1.0. "
"We strongly recommend using 'encryption' instead for better security."
)
# Warn about password deprecation
if has_password:
_LOGGER.warning(
"API 'password' authentication has been deprecated since May 2022 and will be removed in version 2026.1.0. "
"Please migrate to the 'encryption' configuration. "
"See https://esphome.io/components/api.html#configuration-variables"
)
return config
CONFIG_SCHEMA = cv.All(
cv.Schema(
{
@@ -174,58 +118,19 @@ CONFIG_SCHEMA = cv.All(
cv.Range(max=cv.TimePeriod(milliseconds=65535)),
),
cv.Optional(CONF_CUSTOM_SERVICES, default=False): cv.boolean,
cv.Optional(CONF_HOMEASSISTANT_SERVICES, default=False): cv.boolean,
cv.Optional(CONF_HOMEASSISTANT_STATES, default=False): cv.boolean,
cv.Optional(CONF_ON_CLIENT_CONNECTED): automation.validate_automation(
single=True
),
cv.Optional(CONF_ON_CLIENT_DISCONNECTED): automation.validate_automation(
single=True
),
# Connection limits to prevent memory exhaustion on resource-constrained devices
# Each connection uses ~500-1000 bytes of RAM plus system resources
# Platform defaults based on available RAM and network stack implementation:
cv.SplitDefault(
CONF_LISTEN_BACKLOG,
esp8266=1, # Limited RAM (~40KB free), LWIP raw sockets
esp32=4, # More RAM (520KB), BSD sockets
rp2040=1, # Limited RAM (264KB), LWIP raw sockets like ESP8266
bk72xx=4, # Moderate RAM, BSD-style sockets
rtl87xx=4, # Moderate RAM, BSD-style sockets
host=4, # Abundant resources
ln882x=4, # Moderate RAM
): cv.int_range(min=1, max=10),
cv.SplitDefault(
CONF_MAX_CONNECTIONS,
esp8266=4, # ~40KB free RAM, each connection uses ~500-1000 bytes
esp32=8, # 520KB RAM available
rp2040=4, # 264KB RAM but LWIP constraints
bk72xx=8, # Moderate RAM
rtl87xx=8, # Moderate RAM
host=8, # Abundant resources
ln882x=8, # Moderate RAM
): cv.int_range(min=1, max=20),
# Maximum queued send buffers per connection before dropping connection
# Each buffer uses ~8-12 bytes overhead plus actual message size
# Platform defaults based on available RAM and typical message rates:
cv.SplitDefault(
CONF_MAX_SEND_QUEUE,
esp8266=5, # Limited RAM, need to fail fast
esp32=8, # More RAM, can buffer more
rp2040=5, # Limited RAM
bk72xx=8, # Moderate RAM
rtl87xx=8, # Moderate RAM
host=16, # Abundant resources
ln882x=8, # Moderate RAM
): cv.int_range(min=1, max=64),
}
).extend(cv.COMPONENT_SCHEMA),
cv.rename_key(CONF_SERVICES, CONF_ACTIONS),
_validate_api_config,
)
@coroutine_with_priority(CoroPriority.WEB)
@coroutine_with_priority(40.0)
async def to_code(config):
var = cg.new_Pvariable(config[CONF_ID])
await cg.register_component(var, config)
@@ -236,22 +141,11 @@ async def to_code(config):
cg.add(var.set_password(config[CONF_PASSWORD]))
cg.add(var.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))
cg.add(var.set_batch_delay(config[CONF_BATCH_DELAY]))
if CONF_LISTEN_BACKLOG in config:
cg.add(var.set_listen_backlog(config[CONF_LISTEN_BACKLOG]))
if CONF_MAX_CONNECTIONS in config:
cg.add(var.set_max_connections(config[CONF_MAX_CONNECTIONS]))
cg.add_define("API_MAX_SEND_QUEUE", config[CONF_MAX_SEND_QUEUE])
# Set USE_API_SERVICES if any services are enabled
if config.get(CONF_ACTIONS) or config[CONF_CUSTOM_SERVICES]:
cg.add_define("USE_API_SERVICES")
if config[CONF_HOMEASSISTANT_SERVICES]:
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
if config[CONF_HOMEASSISTANT_STATES]:
cg.add_define("USE_API_HOMEASSISTANT_STATES")
if actions := config.get(CONF_ACTIONS, []):
for conf in actions:
template_args = []
@@ -289,7 +183,6 @@ async def to_code(config):
if key := encryption_config.get(CONF_KEY):
decoded = base64.b64decode(key)
cg.add(var.set_noise_psk(list(decoded)))
cg.add_define("USE_API_NOISE_PSK_FROM_YAML")
else:
# No key provided, but encryption desired
# This will allow a plaintext client to provide a noise key,
@@ -309,29 +202,6 @@ async def to_code(config):
KEY_VALUE_SCHEMA = cv.Schema({cv.string: cv.templatable(cv.string_strict)})
def _validate_response_config(config: ConfigType) -> ConfigType:
# Validate dependencies:
# - response_template requires capture_response: true
# - capture_response: true requires on_success
if CONF_RESPONSE_TEMPLATE in config and not config[CONF_CAPTURE_RESPONSE]:
raise cv.Invalid(
f"`{CONF_RESPONSE_TEMPLATE}` requires `{CONF_CAPTURE_RESPONSE}: true` to be set.",
path=[CONF_RESPONSE_TEMPLATE],
)
if config[CONF_CAPTURE_RESPONSE] and CONF_ON_SUCCESS not in config:
raise cv.Invalid(
f"`{CONF_CAPTURE_RESPONSE}: true` requires `{CONF_ON_SUCCESS}` to be set.",
path=[CONF_CAPTURE_RESPONSE],
)
# Track if any action uses capture_response for AUTO_LOAD
if config[CONF_CAPTURE_RESPONSE]:
CORE.data.setdefault(DOMAIN, {})[CONF_CAPTURE_RESPONSE] = True
return config
HOMEASSISTANT_ACTION_ACTION_SCHEMA = cv.All(
cv.Schema(
{
@@ -347,15 +217,10 @@ HOMEASSISTANT_ACTION_ACTION_SCHEMA = cv.All(
cv.Optional(CONF_VARIABLES, default={}): cv.Schema(
{cv.string: cv.returning_lambda}
),
cv.Optional(CONF_RESPONSE_TEMPLATE): cv.templatable(cv.string),
cv.Optional(CONF_CAPTURE_RESPONSE, default=False): cv.boolean,
cv.Optional(CONF_ON_SUCCESS): automation.validate_automation(single=True),
cv.Optional(CONF_ON_ERROR): automation.validate_automation(single=True),
}
),
cv.has_exactly_one_key(CONF_SERVICE, CONF_ACTION),
cv.rename_key(CONF_SERVICE, CONF_ACTION),
_validate_response_config,
)
@@ -369,13 +234,7 @@ HOMEASSISTANT_ACTION_ACTION_SCHEMA = cv.All(
HomeAssistantServiceCallAction,
HOMEASSISTANT_ACTION_ACTION_SCHEMA,
)
async def homeassistant_service_to_code(
config: ConfigType,
action_id: ID,
template_arg: cg.TemplateArguments,
args: TemplateArgsType,
):
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
async def homeassistant_service_to_code(config, action_id, template_arg, args):
serv = await cg.get_variable(config[CONF_ID])
var = cg.new_Pvariable(action_id, template_arg, serv, False)
templ = await cg.templatable(config[CONF_ACTION], args, None)
@@ -389,40 +248,6 @@ async def homeassistant_service_to_code(
for key, value in config[CONF_VARIABLES].items():
templ = await cg.templatable(value, args, None)
cg.add(var.add_variable(key, templ))
if on_error := config.get(CONF_ON_ERROR):
cg.add_define("USE_API_HOMEASSISTANT_ACTION_RESPONSES")
cg.add_define("USE_API_HOMEASSISTANT_ACTION_RESPONSES_ERRORS")
cg.add(var.set_wants_status())
await automation.build_automation(
var.get_error_trigger(),
[(cg.std_string, "error"), *args],
on_error,
)
if on_success := config.get(CONF_ON_SUCCESS):
cg.add_define("USE_API_HOMEASSISTANT_ACTION_RESPONSES")
cg.add(var.set_wants_status())
if config[CONF_CAPTURE_RESPONSE]:
cg.add(var.set_wants_response())
cg.add_define("USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON")
await automation.build_automation(
var.get_success_trigger_with_response(),
[(cg.JsonObjectConst, "response"), *args],
on_success,
)
if response_template := config.get(CONF_RESPONSE_TEMPLATE):
templ = await cg.templatable(response_template, args, cg.std_string)
cg.add(var.set_response_template(templ))
else:
await automation.build_automation(
var.get_success_trigger(),
args,
on_success,
)
return var
@@ -453,7 +278,6 @@ HOMEASSISTANT_EVENT_ACTION_SCHEMA = cv.Schema(
HOMEASSISTANT_EVENT_ACTION_SCHEMA,
)
async def homeassistant_event_to_code(config, action_id, template_arg, args):
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
serv = await cg.get_variable(config[CONF_ID])
var = cg.new_Pvariable(action_id, template_arg, serv, True)
templ = await cg.templatable(config[CONF_EVENT], args, None)
@@ -485,7 +309,6 @@ HOMEASSISTANT_TAG_SCANNED_ACTION_SCHEMA = cv.maybe_simple_value(
HOMEASSISTANT_TAG_SCANNED_ACTION_SCHEMA,
)
async def homeassistant_tag_scanned_to_code(config, action_id, template_arg, args):
cg.add_define("USE_API_HOMEASSISTANT_SERVICES")
serv = await cg.get_variable(config[CONF_ID])
var = cg.new_Pvariable(action_id, template_arg, serv, True)
cg.add(var.set_service("esphome.tag_scanned"))
@@ -500,10 +323,9 @@ async def api_connected_to_code(config, condition_id, template_arg, args):
def FILTER_SOURCE_FILES() -> list[str]:
"""Filter out api_pb2_dump.cpp when proto message dumping is not enabled,
user_services.cpp when no services are defined, and protocol-specific
implementations based on encryption configuration."""
files_to_filter: list[str] = []
"""Filter out api_pb2_dump.cpp when proto message dumping is not enabled
and user_services.cpp when no services are defined."""
files_to_filter = []
# api_pb2_dump.cpp is only needed when HAS_PROTO_MESSAGE_DUMP is defined
# This is a particularly large file that still needs to be opened and read
@@ -519,16 +341,4 @@ def FILTER_SOURCE_FILES() -> list[str]:
if config and not config.get(CONF_ACTIONS) and not config[CONF_CUSTOM_SERVICES]:
files_to_filter.append("user_services.cpp")
# Filter protocol-specific implementations based on encryption configuration
encryption_config = config.get(CONF_ENCRYPTION) if config else None
# If encryption is not configured at all, we only need plaintext
if encryption_config is None:
files_to_filter.append("api_frame_helper_noise.cpp")
# If encryption is configured with a key, we only need noise
elif encryption_config.get(CONF_KEY):
files_to_filter.append("api_frame_helper_plaintext.cpp")
# If encryption is configured but no key is provided, we need both
# (this allows a plaintext client to provide a noise key)
return files_to_filter

View File

@@ -7,7 +7,7 @@ service APIConnection {
option (needs_setup_connection) = false;
option (needs_authentication) = false;
}
rpc authenticate (AuthenticationRequest) returns (AuthenticationResponse) {
rpc connect (ConnectRequest) returns (ConnectResponse) {
option (needs_setup_connection) = false;
option (needs_authentication) = false;
}
@@ -27,6 +27,9 @@ service APIConnection {
rpc subscribe_logs (SubscribeLogsRequest) returns (void) {}
rpc subscribe_homeassistant_services (SubscribeHomeassistantServicesRequest) returns (void) {}
rpc subscribe_home_assistant_states (SubscribeHomeAssistantStatesRequest) returns (void) {}
rpc get_time (GetTimeRequest) returns (GetTimeResponse) {
option (needs_authentication) = false;
}
rpc execute_service (ExecuteServiceRequest) returns (void) {}
rpc noise_encryption_set_key (NoiseEncryptionSetKeyRequest) returns (NoiseEncryptionSetKeyResponse) {}
@@ -66,9 +69,6 @@ service APIConnection {
rpc voice_assistant_set_configuration(VoiceAssistantSetConfiguration) returns (void) {}
rpc alarm_control_panel_command (AlarmControlPanelCommandRequest) returns (void) {}
rpc zwave_proxy_frame(ZWaveProxyFrame) returns (void) {}
rpc zwave_proxy_request(ZWaveProxyRequest) returns (void) {}
}
@@ -102,7 +102,7 @@ message HelloRequest {
// For example "Home Assistant"
// Not strictly necessary to send but nice for debugging
// purposes.
string client_info = 1 [(pointer_to_buffer) = true];
string client_info = 1;
uint32 api_version_major = 2;
uint32 api_version_minor = 3;
}
@@ -132,23 +132,21 @@ message HelloResponse {
// Message sent at the beginning of each connection to authenticate the client
// Can only be sent by the client and only at the beginning of the connection
message AuthenticationRequest {
message ConnectRequest {
option (id) = 3;
option (source) = SOURCE_CLIENT;
option (no_delay) = true;
option (ifdef) = "USE_API_PASSWORD";
// The password to log in with
string password = 1 [(pointer_to_buffer) = true];
string password = 1;
}
// Confirmation of successful connection. After this the connection is available for all traffic.
// Can only be sent by the server and only at the beginning of the connection
message AuthenticationResponse {
message ConnectResponse {
option (id) = 4;
option (source) = SOURCE_SERVER;
option (no_delay) = true;
option (ifdef) = "USE_API_PASSWORD";
bool invalid_password = 1;
}
@@ -205,7 +203,7 @@ message DeviceInfoResponse {
option (id) = 10;
option (source) = SOURCE_SERVER;
bool uses_password = 1 [(field_ifdef) = "USE_API_PASSWORD"];
bool uses_password = 1;
// The name of the node, given by "App.set_name()"
string name = 2;
@@ -232,16 +230,14 @@ message DeviceInfoResponse {
uint32 webserver_port = 10 [(field_ifdef) = "USE_WEBSERVER"];
// Deprecated in API version 1.9
uint32 legacy_bluetooth_proxy_version = 11 [deprecated=true, (field_ifdef) = "USE_BLUETOOTH_PROXY"];
uint32 legacy_bluetooth_proxy_version = 11 [(field_ifdef) = "USE_BLUETOOTH_PROXY"];
uint32 bluetooth_proxy_feature_flags = 15 [(field_ifdef) = "USE_BLUETOOTH_PROXY"];
string manufacturer = 12;
string friendly_name = 13;
// Deprecated in API version 1.10
uint32 legacy_voice_assistant_version = 14 [deprecated=true, (field_ifdef) = "USE_VOICE_ASSISTANT"];
uint32 legacy_voice_assistant_version = 14 [(field_ifdef) = "USE_VOICE_ASSISTANT"];
uint32 voice_assistant_feature_flags = 17 [(field_ifdef) = "USE_VOICE_ASSISTANT"];
string suggested_area = 16 [(field_ifdef) = "USE_AREAS"];
@@ -252,15 +248,11 @@ message DeviceInfoResponse {
// Supports receiving and saving api encryption key
bool api_encryption_supported = 19 [(field_ifdef) = "USE_API_NOISE"];
repeated DeviceInfo devices = 20 [(field_ifdef) = "USE_DEVICES", (fixed_array_size_define) = "ESPHOME_DEVICE_COUNT"];
repeated AreaInfo areas = 21 [(field_ifdef) = "USE_AREAS", (fixed_array_size_define) = "ESPHOME_AREA_COUNT"];
repeated DeviceInfo devices = 20 [(field_ifdef) = "USE_DEVICES"];
repeated AreaInfo areas = 21 [(field_ifdef) = "USE_AREAS"];
// Top-level area info to phase out suggested_area
AreaInfo area = 22 [(field_ifdef) = "USE_AREAS"];
// Indicates if Z-Wave proxy support is available and features supported
uint32 zwave_proxy_feature_flags = 23 [(field_ifdef) = "USE_ZWAVE_PROXY"];
uint32 zwave_home_id = 24 [(field_ifdef) = "USE_ZWAVE_PROXY"];
}
message ListEntitiesRequest {
@@ -345,9 +337,7 @@ message ListEntitiesCoverResponse {
uint32 device_id = 13 [(field_ifdef) = "USE_DEVICES"];
}
// Deprecated in API version 1.1
enum LegacyCoverState {
option deprecated = true;
LEGACY_COVER_STATE_OPEN = 0;
LEGACY_COVER_STATE_CLOSED = 1;
}
@@ -366,8 +356,7 @@ message CoverStateResponse {
fixed32 key = 1;
// legacy: state has been removed in 1.13
// clients/servers must still send/accept it until the next protocol change
// Deprecated in API version 1.1
LegacyCoverState legacy_state = 2 [deprecated=true];
LegacyCoverState legacy_state = 2;
float position = 3;
float tilt = 4;
@@ -375,9 +364,7 @@ message CoverStateResponse {
uint32 device_id = 6 [(field_ifdef) = "USE_DEVICES"];
}
// Deprecated in API version 1.1
enum LegacyCoverCommand {
option deprecated = true;
LEGACY_COVER_COMMAND_OPEN = 0;
LEGACY_COVER_COMMAND_CLOSE = 1;
LEGACY_COVER_COMMAND_STOP = 2;
@@ -393,10 +380,8 @@ message CoverCommandRequest {
// legacy: command has been removed in 1.13
// clients/servers must still send/accept it until the next protocol change
// Deprecated in API version 1.1
bool has_legacy_command = 2 [deprecated=true];
// Deprecated in API version 1.1
LegacyCoverCommand legacy_command = 3 [deprecated=true];
bool has_legacy_command = 2;
LegacyCoverCommand legacy_command = 3;
bool has_position = 4;
float position = 5;
@@ -425,12 +410,10 @@ message ListEntitiesFanResponse {
bool disabled_by_default = 9;
string icon = 10 [(field_ifdef) = "USE_ENTITY_ICON"];
EntityCategory entity_category = 11;
repeated string supported_preset_modes = 12 [(container_pointer) = "std::set"];
repeated string supported_preset_modes = 12;
uint32 device_id = 13 [(field_ifdef) = "USE_DEVICES"];
}
// Deprecated in API version 1.6 - only used in deprecated fields
enum FanSpeed {
option deprecated = true;
FAN_SPEED_LOW = 0;
FAN_SPEED_MEDIUM = 1;
FAN_SPEED_HIGH = 2;
@@ -449,8 +432,7 @@ message FanStateResponse {
fixed32 key = 1;
bool state = 2;
bool oscillating = 3;
// Deprecated in API version 1.6
FanSpeed speed = 4 [deprecated=true];
FanSpeed speed = 4 [deprecated = true];
FanDirection direction = 5;
int32 speed_level = 6;
string preset_mode = 7;
@@ -466,10 +448,8 @@ message FanCommandRequest {
fixed32 key = 1;
bool has_state = 2;
bool state = 3;
// Deprecated in API version 1.6
bool has_speed = 4 [deprecated=true];
// Deprecated in API version 1.6
FanSpeed speed = 5 [deprecated=true];
bool has_speed = 4 [deprecated = true];
FanSpeed speed = 5 [deprecated = true];
bool has_oscillating = 6;
bool oscillating = 7;
bool has_direction = 8;
@@ -506,15 +486,11 @@ message ListEntitiesLightResponse {
string name = 3;
reserved 4; // Deprecated: was string unique_id
repeated ColorMode supported_color_modes = 12 [(container_pointer) = "std::set<light::ColorMode>"];
repeated ColorMode supported_color_modes = 12;
// next four supports_* are for legacy clients, newer clients should use color modes
// Deprecated in API version 1.6
bool legacy_supports_brightness = 5 [deprecated=true];
// Deprecated in API version 1.6
bool legacy_supports_rgb = 6 [deprecated=true];
// Deprecated in API version 1.6
bool legacy_supports_white_value = 7 [deprecated=true];
// Deprecated in API version 1.6
bool legacy_supports_color_temperature = 8 [deprecated=true];
float min_mireds = 9;
float max_mireds = 10;
@@ -591,9 +567,7 @@ enum SensorStateClass {
STATE_CLASS_TOTAL = 3;
}
// Deprecated in API version 1.5
enum SensorLastResetType {
option deprecated = true;
LAST_RESET_NONE = 0;
LAST_RESET_NEVER = 1;
LAST_RESET_AUTO = 2;
@@ -617,8 +591,7 @@ message ListEntitiesSensorResponse {
string device_class = 9;
SensorStateClass state_class = 10;
// Last reset type removed in 2021.9.0
// Deprecated in API version 1.5
SensorLastResetType legacy_last_reset_type = 11 [deprecated=true];
SensorLastResetType legacy_last_reset_type = 11;
bool disabled_by_default = 12;
EntityCategory entity_category = 13;
uint32 device_id = 14 [(field_ifdef) = "USE_DEVICES"];
@@ -738,6 +711,7 @@ message SubscribeLogsResponse {
LogLevel level = 1;
bytes message = 3;
bool send_failed = 4;
}
// ==================== NOISE ENCRYPTION ====================
@@ -761,41 +735,23 @@ message NoiseEncryptionSetKeyResponse {
message SubscribeHomeassistantServicesRequest {
option (id) = 34;
option (source) = SOURCE_CLIENT;
option (ifdef) = "USE_API_HOMEASSISTANT_SERVICES";
}
message HomeassistantServiceMap {
string key = 1;
string value = 2 [(no_zero_copy) = true];
string value = 2;
}
message HomeassistantActionRequest {
message HomeassistantServiceResponse {
option (id) = 35;
option (source) = SOURCE_SERVER;
option (no_delay) = true;
option (ifdef) = "USE_API_HOMEASSISTANT_SERVICES";
string service = 1;
repeated HomeassistantServiceMap data = 2;
repeated HomeassistantServiceMap data_template = 3;
repeated HomeassistantServiceMap variables = 4;
bool is_event = 5;
uint32 call_id = 6 [(field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES"];
bool wants_response = 7 [(field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON"];
string response_template = 8 [(no_zero_copy) = true, (field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON"];
}
// Message sent by Home Assistant to ESPHome with service call response data
message HomeassistantActionResponse {
option (id) = 130;
option (source) = SOURCE_CLIENT;
option (no_delay) = true;
option (ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES";
uint32 call_id = 1; // Matches the call_id from HomeassistantActionRequest
bool success = 2; // Whether the service call succeeded
string error_message = 3; // Error message if success = false
bytes response_data = 4 [(pointer_to_buffer) = true, (field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON"];
}
// ==================== IMPORT HOME ASSISTANT STATES ====================
@@ -805,13 +761,11 @@ message HomeassistantActionResponse {
message SubscribeHomeAssistantStatesRequest {
option (id) = 38;
option (source) = SOURCE_CLIENT;
option (ifdef) = "USE_API_HOMEASSISTANT_STATES";
}
message SubscribeHomeAssistantStateResponse {
option (id) = 39;
option (source) = SOURCE_SERVER;
option (ifdef) = "USE_API_HOMEASSISTANT_STATES";
string entity_id = 1;
string attribute = 2;
bool once = 3;
@@ -821,7 +775,6 @@ message HomeAssistantStateResponse {
option (id) = 40;
option (source) = SOURCE_CLIENT;
option (no_delay) = true;
option (ifdef) = "USE_API_HOMEASSISTANT_STATES";
string entity_id = 1;
string state = 2;
@@ -831,16 +784,15 @@ message HomeAssistantStateResponse {
// ==================== IMPORT TIME ====================
message GetTimeRequest {
option (id) = 36;
option (source) = SOURCE_SERVER;
option (source) = SOURCE_BOTH;
}
message GetTimeResponse {
option (id) = 37;
option (source) = SOURCE_CLIENT;
option (source) = SOURCE_BOTH;
option (no_delay) = true;
fixed32 epoch_seconds = 1;
string timezone = 2 [(pointer_to_buffer) = true];
}
// ==================== USER-DEFINES SERVICES ====================
@@ -989,20 +941,19 @@ message ListEntitiesClimateResponse {
bool supports_current_temperature = 5;
bool supports_two_point_target_temperature = 6;
repeated ClimateMode supported_modes = 7 [(container_pointer) = "std::set<climate::ClimateMode>"];
repeated ClimateMode supported_modes = 7;
float visual_min_temperature = 8;
float visual_max_temperature = 9;
float visual_target_temperature_step = 10;
// for older peer versions - in new system this
// is if CLIMATE_PRESET_AWAY exists is supported_presets
// Deprecated in API version 1.5
bool legacy_supports_away = 11 [deprecated=true];
bool legacy_supports_away = 11;
bool supports_action = 12;
repeated ClimateFanMode supported_fan_modes = 13 [(container_pointer) = "std::set<climate::ClimateFanMode>"];
repeated ClimateSwingMode supported_swing_modes = 14 [(container_pointer) = "std::set<climate::ClimateSwingMode>"];
repeated string supported_custom_fan_modes = 15 [(container_pointer) = "std::set"];
repeated ClimatePreset supported_presets = 16 [(container_pointer) = "std::set<climate::ClimatePreset>"];
repeated string supported_custom_presets = 17 [(container_pointer) = "std::set"];
repeated ClimateFanMode supported_fan_modes = 13;
repeated ClimateSwingMode supported_swing_modes = 14;
repeated string supported_custom_fan_modes = 15;
repeated ClimatePreset supported_presets = 16;
repeated string supported_custom_presets = 17;
bool disabled_by_default = 18;
string icon = 19 [(field_ifdef) = "USE_ENTITY_ICON"];
EntityCategory entity_category = 20;
@@ -1027,8 +978,7 @@ message ClimateStateResponse {
float target_temperature_low = 5;
float target_temperature_high = 6;
// For older peers, equal to preset == CLIMATE_PRESET_AWAY
// Deprecated in API version 1.5
bool unused_legacy_away = 7 [deprecated=true];
bool unused_legacy_away = 7;
ClimateAction action = 8;
ClimateFanMode fan_mode = 9;
ClimateSwingMode swing_mode = 10;
@@ -1056,10 +1006,8 @@ message ClimateCommandRequest {
bool has_target_temperature_high = 8;
float target_temperature_high = 9;
// legacy, for older peers, newer ones should use CLIMATE_PRESET_AWAY in preset
// Deprecated in API version 1.5
bool unused_has_legacy_away = 10 [deprecated=true];
// Deprecated in API version 1.5
bool unused_legacy_away = 11 [deprecated=true];
bool unused_has_legacy_away = 10;
bool unused_legacy_away = 11;
bool has_fan_mode = 12;
ClimateFanMode fan_mode = 13;
bool has_swing_mode = 14;
@@ -1142,7 +1090,7 @@ message ListEntitiesSelectResponse {
reserved 4; // Deprecated: was string unique_id
string icon = 5 [(field_ifdef) = "USE_ENTITY_ICON"];
repeated string options = 6 [(container_pointer) = "std::vector"];
repeated string options = 6;
bool disabled_by_default = 7;
EntityCategory entity_category = 8;
uint32 device_id = 9 [(field_ifdef) = "USE_DEVICES"];
@@ -1320,9 +1268,6 @@ enum MediaPlayerState {
MEDIA_PLAYER_STATE_IDLE = 1;
MEDIA_PLAYER_STATE_PLAYING = 2;
MEDIA_PLAYER_STATE_PAUSED = 3;
MEDIA_PLAYER_STATE_ANNOUNCING = 4;
MEDIA_PLAYER_STATE_OFF = 5;
MEDIA_PLAYER_STATE_ON = 6;
}
enum MediaPlayerCommand {
MEDIA_PLAYER_COMMAND_PLAY = 0;
@@ -1330,15 +1275,6 @@ enum MediaPlayerCommand {
MEDIA_PLAYER_COMMAND_STOP = 2;
MEDIA_PLAYER_COMMAND_MUTE = 3;
MEDIA_PLAYER_COMMAND_UNMUTE = 4;
MEDIA_PLAYER_COMMAND_TOGGLE = 5;
MEDIA_PLAYER_COMMAND_VOLUME_UP = 6;
MEDIA_PLAYER_COMMAND_VOLUME_DOWN = 7;
MEDIA_PLAYER_COMMAND_ENQUEUE = 8;
MEDIA_PLAYER_COMMAND_REPEAT_ONE = 9;
MEDIA_PLAYER_COMMAND_REPEAT_OFF = 10;
MEDIA_PLAYER_COMMAND_CLEAR_PLAYLIST = 11;
MEDIA_PLAYER_COMMAND_TURN_ON = 12;
MEDIA_PLAYER_COMMAND_TURN_OFF = 13;
}
enum MediaPlayerFormatPurpose {
MEDIA_PLAYER_FORMAT_PURPOSE_DEFAULT = 0;
@@ -1373,8 +1309,6 @@ message ListEntitiesMediaPlayerResponse {
repeated MediaPlayerSupportedFormat supported_formats = 9;
uint32 device_id = 10 [(field_ifdef) = "USE_DEVICES"];
uint32 feature_flags = 11;
}
message MediaPlayerStateResponse {
option (id) = 64;
@@ -1420,17 +1354,12 @@ message SubscribeBluetoothLEAdvertisementsRequest {
uint32 flags = 1;
}
// Deprecated - only used by deprecated BluetoothLEAdvertisementResponse
message BluetoothServiceData {
option deprecated = true;
string uuid = 1;
// Deprecated in API version 1.7
repeated uint32 legacy_data = 2 [deprecated=true]; // Removed in api version 1.7
repeated uint32 legacy_data = 2 [deprecated = true]; // Removed in api version 1.7
bytes data = 3; // Added in api version 1.7
}
// Removed in ESPHome 2025.8.0 - use BluetoothLERawAdvertisementsResponse instead
message BluetoothLEAdvertisementResponse {
option deprecated = true;
option (id) = 67;
option (source) = SOURCE_SERVER;
option (ifdef) = "USE_BLUETOOTH_PROXY";
@@ -1461,11 +1390,11 @@ message BluetoothLERawAdvertisementsResponse {
option (ifdef) = "USE_BLUETOOTH_PROXY";
option (no_delay) = true;
repeated BluetoothLERawAdvertisement advertisements = 1 [(fixed_array_with_length_define) = "BLUETOOTH_PROXY_ADVERTISEMENT_BATCH_SIZE"];
repeated BluetoothLERawAdvertisement advertisements = 1;
}
enum BluetoothDeviceRequestType {
BLUETOOTH_DEVICE_REQUEST_TYPE_CONNECT = 0 [deprecated = true]; // V1 removed, use V3 variants
BLUETOOTH_DEVICE_REQUEST_TYPE_CONNECT = 0;
BLUETOOTH_DEVICE_REQUEST_TYPE_DISCONNECT = 1;
BLUETOOTH_DEVICE_REQUEST_TYPE_PAIR = 2;
BLUETOOTH_DEVICE_REQUEST_TYPE_UNPAIR = 3;
@@ -1481,7 +1410,7 @@ message BluetoothDeviceRequest {
uint64 address = 1;
BluetoothDeviceRequestType request_type = 2;
bool has_address_type = 3; // Deprecated, should be removed in 2027.8 - https://github.com/esphome/esphome/pull/10318
bool has_address_type = 3;
uint32 address_type = 4;
}
@@ -1505,39 +1434,21 @@ message BluetoothGATTGetServicesRequest {
}
message BluetoothGATTDescriptor {
repeated uint64 uuid = 1 [(fixed_array_size) = 2, (fixed_array_skip_zero) = true];
repeated uint64 uuid = 1;
uint32 handle = 2;
// New field for efficient UUID (v1.12+)
// Only one of uuid or short_uuid will be set.
// short_uuid is used for both 16-bit and 32-bit UUIDs with v1.12+ clients.
// 128-bit UUIDs always use the uuid field for backwards compatibility.
uint32 short_uuid = 3; // 16-bit or 32-bit UUID
}
message BluetoothGATTCharacteristic {
repeated uint64 uuid = 1 [(fixed_array_size) = 2, (fixed_array_skip_zero) = true];
repeated uint64 uuid = 1;
uint32 handle = 2;
uint32 properties = 3;
repeated BluetoothGATTDescriptor descriptors = 4;
// New field for efficient UUID (v1.12+)
// Only one of uuid or short_uuid will be set.
// short_uuid is used for both 16-bit and 32-bit UUIDs with v1.12+ clients.
// 128-bit UUIDs always use the uuid field for backwards compatibility.
uint32 short_uuid = 5; // 16-bit or 32-bit UUID
}
message BluetoothGATTService {
repeated uint64 uuid = 1 [(fixed_array_size) = 2, (fixed_array_skip_zero) = true];
repeated uint64 uuid = 1;
uint32 handle = 2;
repeated BluetoothGATTCharacteristic characteristics = 3;
// New field for efficient UUID (v1.12+)
// Only one of uuid or short_uuid will be set.
// short_uuid is used for both 16-bit and 32-bit UUIDs with v1.12+ clients.
// 128-bit UUIDs always use the uuid field for backwards compatibility.
uint32 short_uuid = 4; // 16-bit or 32-bit UUID
}
message BluetoothGATTGetServicesResponse {
@@ -1587,7 +1498,7 @@ message BluetoothGATTWriteRequest {
uint32 handle = 2;
bool response = 3;
bytes data = 4 [(pointer_to_buffer) = true];
bytes data = 4;
}
message BluetoothGATTReadDescriptorRequest {
@@ -1607,7 +1518,7 @@ message BluetoothGATTWriteDescriptorRequest {
uint64 address = 1;
uint32 handle = 2;
bytes data = 3 [(pointer_to_buffer) = true];
bytes data = 3;
}
message BluetoothGATTNotifyRequest {
@@ -1644,10 +1555,7 @@ message BluetoothConnectionsFreeResponse {
uint32 free = 1;
uint32 limit = 2;
repeated uint64 allocated = 3 [
(fixed_array_size_define) = "BLUETOOTH_PROXY_MAX_CONNECTIONS",
(fixed_array_skip_zero) = true
];
repeated uint64 allocated = 3;
}
message BluetoothGATTErrorResponse {
@@ -1735,7 +1643,6 @@ message BluetoothScannerStateResponse {
BluetoothScannerState state = 1;
BluetoothScannerMode mode = 2;
BluetoothScannerMode configured_mode = 3;
}
message BluetoothScannerSetModeRequest {
@@ -1881,22 +1788,10 @@ message VoiceAssistantWakeWord {
repeated string trained_languages = 3;
}
message VoiceAssistantExternalWakeWord {
string id = 1;
string wake_word = 2;
repeated string trained_languages = 3;
string model_type = 4;
uint32 model_size = 5;
string model_hash = 6;
string url = 7;
}
message VoiceAssistantConfigurationRequest {
option (id) = 121;
option (source) = SOURCE_CLIENT;
option (ifdef) = "USE_VOICE_ASSISTANT";
repeated VoiceAssistantExternalWakeWord external_wake_words = 1;
}
message VoiceAssistantConfigurationResponse {
@@ -1905,7 +1800,7 @@ message VoiceAssistantConfigurationResponse {
option (ifdef) = "USE_VOICE_ASSISTANT";
repeated VoiceAssistantWakeWord available_wake_words = 1;
repeated string active_wake_words = 2 [(container_pointer) = "std::vector"];
repeated string active_wake_words = 2;
uint32 max_active_wake_words = 3;
}
@@ -2311,28 +2206,3 @@ message UpdateCommandRequest {
UpdateCommand command = 2;
uint32 device_id = 3 [(field_ifdef) = "USE_DEVICES"];
}
// ==================== Z-WAVE ====================
message ZWaveProxyFrame {
option (id) = 128;
option (source) = SOURCE_BOTH;
option (ifdef) = "USE_ZWAVE_PROXY";
option (no_delay) = true;
bytes data = 1 [(pointer_to_buffer) = true];
}
enum ZWaveProxyRequestType {
ZWAVE_PROXY_REQUEST_TYPE_SUBSCRIBE = 0;
ZWAVE_PROXY_REQUEST_TYPE_UNSUBSCRIBE = 1;
ZWAVE_PROXY_REQUEST_TYPE_HOME_ID_CHANGE = 2;
}
message ZWaveProxyRequest {
option (id) = 129;
option (source) = SOURCE_BOTH;
option (ifdef) = "USE_ZWAVE_PROXY";
ZWaveProxyRequestType type = 1;
bytes data = 2 [(pointer_to_buffer) = true];
}

File diff suppressed because it is too large


@@ -10,33 +10,18 @@
#include "esphome/core/component.h"
#include "esphome/core/entity_base.h"
#include <functional>
#include <vector>
#include <functional>
namespace esphome::api {
// Client information structure
struct ClientInfo {
std::string name; // Client name from Hello message
std::string peername; // IP:port from socket
};
namespace esphome {
namespace api {
// Keepalive timeout in milliseconds
static constexpr uint32_t KEEPALIVE_TIMEOUT_MS = 60000;
// Maximum number of entities to process in a single batch during initial state/info sending
// This was increased from 20 to 24 after removing the unique_id field from entity info messages,
// which reduced message sizes allowing more entities per batch without exceeding packet limits
static constexpr size_t MAX_INITIAL_PER_BATCH = 24;
// Maximum number of packets to process in a single batch (platform-dependent)
// This limit exists to prevent stack overflow from the PacketInfo array in process_batch_
// Each PacketInfo is 8 bytes, so 64 * 8 = 512 bytes, 32 * 8 = 256 bytes
#if defined(USE_ESP32) || defined(USE_HOST)
static constexpr size_t MAX_PACKETS_PER_BATCH = 64; // ESP32 has 8KB+ stack, HOST has plenty
#else
static constexpr size_t MAX_PACKETS_PER_BATCH = 32; // ESP8266/RP2040/etc have smaller stacks
#endif
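As a sketch of the stack-budget arithmetic in the comment above (assuming PacketInfo stays 8 bytes, which this diff does not guarantee), a compile-time check could look like:
static_assert(sizeof(PacketInfo) == 8, "PacketInfo grew; revisit MAX_PACKETS_PER_BATCH");
static_assert(64 * sizeof(PacketInfo) <= 512 && 32 * sizeof(PacketInfo) <= 256,
              "per-batch PacketInfo array must stay within the documented stack budget");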
static constexpr size_t MAX_INITIAL_PER_BATCH = 20;
class APIConnection final : public APIServerConnection {
class APIConnection : public APIServerConnection {
public:
friend class APIServer;
friend class ListEntitiesIterator;
@@ -123,19 +108,15 @@ class APIConnection final : public APIServerConnection {
void media_player_command(const MediaPlayerCommandRequest &msg) override;
#endif
bool try_send_log_message(int level, const char *tag, const char *line, size_t message_len);
#ifdef USE_API_HOMEASSISTANT_SERVICES
void send_homeassistant_action(const HomeassistantActionRequest &call) {
void send_homeassistant_service_call(const HomeassistantServiceResponse &call) {
if (!this->flags_.service_call_subscription)
return;
this->send_message(call, HomeassistantActionRequest::MESSAGE_TYPE);
this->send_message(call, HomeassistantServiceResponse::MESSAGE_TYPE);
}
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
void on_homeassistant_action_response(const HomeassistantActionResponse &msg) override;
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
#endif // USE_API_HOMEASSISTANT_SERVICES
#ifdef USE_BLUETOOTH_PROXY
void subscribe_bluetooth_le_advertisements(const SubscribeBluetoothLEAdvertisementsRequest &msg) override;
void unsubscribe_bluetooth_le_advertisements(const UnsubscribeBluetoothLEAdvertisementsRequest &msg) override;
bool send_bluetooth_le_advertisement(const BluetoothLEAdvertisementResponse &msg);
void bluetooth_device_request(const BluetoothDeviceRequest &msg) override;
void bluetooth_gatt_read(const BluetoothGATTReadRequest &msg) override;
@@ -144,7 +125,8 @@ class APIConnection final : public APIServerConnection {
void bluetooth_gatt_write_descriptor(const BluetoothGATTWriteDescriptorRequest &msg) override;
void bluetooth_gatt_get_services(const BluetoothGATTGetServicesRequest &msg) override;
void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) override;
bool send_subscribe_bluetooth_connections_free_response(const SubscribeBluetoothConnectionsFreeRequest &msg) override;
BluetoothConnectionsFreeResponse subscribe_bluetooth_connections_free(
const SubscribeBluetoothConnectionsFreeRequest &msg) override;
void bluetooth_scanner_set_mode(const BluetoothScannerSetModeRequest &msg) override;
#endif
@@ -162,15 +144,11 @@ class APIConnection final : public APIServerConnection {
void on_voice_assistant_audio(const VoiceAssistantAudio &msg) override;
void on_voice_assistant_timer_event_response(const VoiceAssistantTimerEventResponse &msg) override;
void on_voice_assistant_announce_request(const VoiceAssistantAnnounceRequest &msg) override;
bool send_voice_assistant_get_configuration_response(const VoiceAssistantConfigurationRequest &msg) override;
VoiceAssistantConfigurationResponse voice_assistant_get_configuration(
const VoiceAssistantConfigurationRequest &msg) override;
void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) override;
#endif
#ifdef USE_ZWAVE_PROXY
void zwave_proxy_frame(const ZWaveProxyFrame &msg) override;
void zwave_proxy_request(const ZWaveProxyRequest &msg) override;
#endif
#ifdef USE_ALARM_CONTROL_PANEL
bool send_alarm_control_panel_state(alarm_control_panel::AlarmControlPanel *a_alarm_control_panel);
void alarm_control_panel_command(const AlarmControlPanelCommandRequest &msg) override;
@@ -190,19 +168,15 @@ class APIConnection final : public APIServerConnection {
// we initiated ping
this->flags_.sent_ping = false;
}
#ifdef USE_API_HOMEASSISTANT_STATES
void on_home_assistant_state_response(const HomeAssistantStateResponse &msg) override;
#endif
#ifdef USE_HOMEASSISTANT_TIME
void on_get_time_response(const GetTimeResponse &value) override;
#endif
bool send_hello_response(const HelloRequest &msg) override;
#ifdef USE_API_PASSWORD
bool send_authenticate_response(const AuthenticationRequest &msg) override;
#endif
bool send_disconnect_response(const DisconnectRequest &msg) override;
bool send_ping_response(const PingRequest &msg) override;
bool send_device_info_response(const DeviceInfoRequest &msg) override;
HelloResponse hello(const HelloRequest &msg) override;
ConnectResponse connect(const ConnectRequest &msg) override;
DisconnectResponse disconnect(const DisconnectRequest &msg) override;
PingResponse ping(const PingRequest &msg) override { return {}; }
DeviceInfoResponse device_info(const DeviceInfoRequest &msg) override;
void list_entities(const ListEntitiesRequest &msg) override { this->list_entities_iterator_.begin(); }
void subscribe_states(const SubscribeStatesRequest &msg) override {
this->flags_.state_subscription = true;
@@ -213,19 +187,19 @@ class APIConnection final : public APIServerConnection {
if (msg.dump_config)
App.schedule_dump_config();
}
#ifdef USE_API_HOMEASSISTANT_SERVICES
void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) override {
this->flags_.service_call_subscription = true;
}
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) override;
#endif
GetTimeResponse get_time(const GetTimeRequest &msg) override {
// TODO
return {};
}
#ifdef USE_API_SERVICES
void execute_service(const ExecuteServiceRequest &msg) override;
#endif
#ifdef USE_API_NOISE
bool send_noise_encryption_set_key_response(const NoiseEncryptionSetKeyRequest &msg) override;
NoiseEncryptionSetKeyResponse noise_encryption_set_key(const NoiseEncryptionSetKeyRequest &msg) override;
#endif
bool is_authenticated() override {
@@ -236,54 +210,73 @@ class APIConnection final : public APIServerConnection {
this->is_authenticated();
}
uint8_t get_log_subscription_level() const { return this->flags_.log_subscription; }
// Get client API version for feature detection
bool client_supports_api_version(uint16_t major, uint16_t minor) const {
return this->client_api_version_major_ > major ||
(this->client_api_version_major_ == major && this->client_api_version_minor_ >= minor);
}
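A hedged usage sketch (not in the diff) of the version gate above, tied to the v1.12 short_uuid fields described in the proto changes; resp, uuid_16bit, uuid_high and uuid_low are illustrative names only.
if (this->client_supports_api_version(1, 12)) {
  resp.short_uuid = uuid_16bit;  // newer peers accept the compact 16-/32-bit field
} else {
  resp.uuid[0] = uuid_high;      // older peers only understand the 128-bit pair
  resp.uuid[1] = uuid_low;
}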
void on_fatal_error() override;
#ifdef USE_API_PASSWORD
void on_unauthenticated_access() override;
#endif
void on_no_setup_connection() override;
ProtoWriteBuffer create_buffer(uint32_t reserve_size) override {
// FIXME: ensure no recursive writes can happen
// Get header padding size - used for both reserve and insert
uint8_t header_padding = this->helper_->frame_header_padding();
// Get shared buffer from parent server
std::vector<uint8_t> &shared_buf = this->parent_->get_shared_buffer_ref();
this->prepare_first_message_buffer(shared_buf, header_padding,
reserve_size + header_padding + this->helper_->frame_footer_size());
return {&shared_buf};
}
void prepare_first_message_buffer(std::vector<uint8_t> &shared_buf, size_t header_padding, size_t total_size) {
shared_buf.clear();
// Reserve space for header padding + message + footer
// - Header padding: space for protocol headers (7 bytes for Noise, 6 for Plaintext)
// - Footer: space for MAC (16 bytes for Noise, 0 for Plaintext)
shared_buf.reserve(total_size);
shared_buf.reserve(reserve_size + header_padding + this->helper_->frame_footer_size());
// Resize to add header padding so message encoding starts at the correct position
shared_buf.resize(header_padding);
return {&shared_buf};
}
// Prepare buffer for next message in batch
ProtoWriteBuffer prepare_message_buffer(uint16_t message_size, bool is_first_message) {
// Get reference to shared buffer (it maintains state between batch messages)
std::vector<uint8_t> &shared_buf = this->parent_->get_shared_buffer_ref();
if (is_first_message) {
shared_buf.clear();
}
size_t current_size = shared_buf.size();
// Calculate padding to add:
// - First message: just header padding
// - Subsequent messages: footer for previous message + header padding for this message
size_t padding_to_add = is_first_message
? this->helper_->frame_header_padding()
: this->helper_->frame_header_padding() + this->helper_->frame_footer_size();
// Reserve space for padding + message
shared_buf.reserve(current_size + padding_to_add + message_size);
// Resize to add the padding bytes
shared_buf.resize(current_size + padding_to_add);
return {&shared_buf};
}
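To make the padding rules above concrete, a worked example under the Noise framing sizes (7-byte header padding, 16-byte footer, values taken from the frame-helper comments, not from this function) for two batched messages of len1 and len2 bytes:
// Buffer after both messages are encoded:
//   [7 hdr pad][len1 payload][16 footer pad][7 hdr pad][len2 payload]
size_t hdr = 7, footer = 16;               // assumed Noise padding sizes
size_t msg2_offset = hdr + len1 + footer;  // where the second frame's header region begins
size_t total_size  = hdr + len1 + footer + hdr + len2;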
bool try_to_clear_buffer(bool log_out_of_space);
bool send_buffer(ProtoWriteBuffer buffer, uint8_t message_type) override;
const std::string &get_name() const { return this->client_info_.name; }
const std::string &get_peername() const { return this->client_info_.peername; }
std::string get_client_combined_info() const {
if (this->client_info_ == this->client_peername_) {
// Before Hello message, both are the same (just IP:port)
return this->client_info_;
}
return this->client_info_ + " (" + this->client_peername_ + ")";
}
// Buffer allocator methods for batch processing
ProtoWriteBuffer allocate_single_message_buffer(uint16_t size);
ProtoWriteBuffer allocate_batch_message_buffer(uint16_t size);
protected:
// Helper function to handle authentication completion
void complete_authentication_();
#ifdef USE_API_HOMEASSISTANT_STATES
void process_state_subscriptions_();
#endif
// Non-template helper to encode any ProtoMessage
static uint16_t encode_message_to_buffer(ProtoMessage &msg, uint8_t message_type, APIConnection *conn,
uint32_t remaining_size, bool is_single);
@@ -303,25 +296,14 @@ class APIConnection final : public APIServerConnection {
APIConnection *conn, uint32_t remaining_size, bool is_single) {
// Set common fields that are shared by all entity types
msg.key = entity->get_object_id_hash();
// Try to use static reference first to avoid allocation
StringRef static_ref = entity->get_object_id_ref_for_api_();
// Store dynamic string outside the if-else to maintain lifetime
std::string object_id;
if (!static_ref.empty()) {
msg.set_object_id(static_ref);
} else {
// Dynamic case - need to allocate
object_id = entity->get_object_id();
msg.set_object_id(StringRef(object_id));
}
msg.object_id = entity->get_object_id();
if (entity->has_own_name()) {
msg.set_name(entity->get_name());
}
if (entity->has_own_name())
msg.name = entity->get_name();
// Set common EntityBase properties
// Set common EntityBase properties
#ifdef USE_ENTITY_ICON
msg.set_icon(entity->get_icon_ref());
msg.icon = entity->get_icon();
#endif
msg.disabled_by_default = entity->is_disabled_by_default();
msg.entity_category = static_cast<enums::EntityCategory>(entity->get_entity_category());
@@ -491,14 +473,13 @@ class APIConnection final : public APIServerConnection {
std::unique_ptr<camera::CameraImageReader> image_reader_;
#endif
// Group 3: Client info struct (24 bytes on 32-bit: 2 strings × 12 bytes each)
ClientInfo client_info_;
// Group 3: Strings (12 bytes each on 32-bit, 4-byte aligned)
std::string client_info_;
std::string client_peername_;
// Group 4: 4-byte types
uint32_t last_traffic_;
#ifdef USE_API_HOMEASSISTANT_STATES
int state_subs_at_ = -1;
#endif
// Function pointer type for message encoding
using MessageCreatorPtr = uint16_t (*)(EntityBase *, APIConnection *, uint32_t remaining_size, bool is_single);
@@ -686,16 +667,10 @@ class APIConnection final : public APIServerConnection {
bool send_message_smart_(EntityBase *entity, MessageCreatorPtr creator, uint8_t message_type,
uint8_t estimated_size) {
// Try to send immediately if:
// 1. It's an UpdateStateResponse (always send immediately to handle cases where
// the main loop is blocked, e.g., during OTA updates)
// 2. OR: We should try to send immediately (should_try_send_immediately = true)
// AND Batch delay is 0 (user has opted in to immediate sending)
// 3. AND: Buffer has space available
if ((
#ifdef USE_UPDATE
message_type == UpdateStateResponse::MESSAGE_TYPE ||
#endif
(this->flags_.should_try_send_immediately && this->get_batch_delay_ms_() == 0)) &&
// 1. We should try to send immediately (should_try_send_immediately = true)
// 2. Batch delay is 0 (user has opted in to immediate sending)
// 3. Buffer has space available
if (this->flags_.should_try_send_immediately && this->get_batch_delay_ms_() == 0 &&
this->helper_->can_write_without_blocking()) {
// Now actually encode and send
if (creator(entity, this, MAX_BATCH_PACKET_SIZE, true) &&
@@ -732,15 +707,8 @@ class APIConnection final : public APIServerConnection {
this->deferred_batch_.add_item_front(entity, MessageCreator(function_ptr), message_type, estimated_size);
return this->schedule_batch_();
}
// Helper function to log API errors with errno
void log_warning_(const LogString *message, APIError err);
// Helper to handle fatal errors with logging
inline void fatal_error_with_log_(const LogString *message, APIError err) {
this->on_fatal_error();
this->log_warning_(message, err);
}
};
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif

File diff suppressed because it is too large


@@ -1,36 +1,23 @@
#pragma once
#include <array>
#include <cstdint>
#include <deque>
#include <limits>
#include <memory>
#include <span>
#include <utility>
#include <vector>
#include "esphome/core/defines.h"
#ifdef USE_API
#include "esphome/components/socket/socket.h"
#include "esphome/core/application.h"
#include "esphome/core/log.h"
namespace esphome::api {
// uncomment to log raw packets
//#define HELPER_LOG_PACKETS
// Maximum message size limits to prevent OOM on constrained devices
// Handshake messages are limited to a small size for security
static constexpr uint16_t MAX_HANDSHAKE_SIZE = 128;
// Data message limits vary by platform based on available memory
#ifdef USE_ESP8266
static constexpr uint16_t MAX_MESSAGE_SIZE = 8192; // 8 KiB for ESP8266
#else
static constexpr uint16_t MAX_MESSAGE_SIZE = 32768; // 32 KiB for ESP32 and other platforms
#ifdef USE_API_NOISE
#include "noise/protocol.h"
#endif
// Forward declaration
struct ClientInfo;
#include "api_noise_context.h"
#include "esphome/components/socket/socket.h"
#include "esphome/core/application.h"
namespace esphome {
namespace api {
class ProtoWriteBuffer;
@@ -53,6 +40,7 @@ struct PacketInfo {
enum class APIError : uint16_t {
OK = 0,
WOULD_BLOCK = 1001,
BAD_HANDSHAKE_PACKET_LEN = 1002,
BAD_INDICATOR = 1003,
BAD_DATA_PACKET = 1004,
TCP_NODELAY_FAILED = 1005,
@@ -63,35 +51,31 @@ enum class APIError : uint16_t {
BAD_ARG = 1010,
SOCKET_READ_FAILED = 1011,
SOCKET_WRITE_FAILED = 1012,
OUT_OF_MEMORY = 1018,
CONNECTION_CLOSED = 1022,
#ifdef USE_API_NOISE
BAD_HANDSHAKE_PACKET_LEN = 1002,
HANDSHAKESTATE_READ_FAILED = 1013,
HANDSHAKESTATE_WRITE_FAILED = 1014,
HANDSHAKESTATE_BAD_STATE = 1015,
CIPHERSTATE_DECRYPT_FAILED = 1016,
CIPHERSTATE_ENCRYPT_FAILED = 1017,
OUT_OF_MEMORY = 1018,
HANDSHAKESTATE_SETUP_FAILED = 1019,
HANDSHAKESTATE_SPLIT_FAILED = 1020,
BAD_HANDSHAKE_ERROR_BYTE = 1021,
#endif
CONNECTION_CLOSED = 1022,
};
const LogString *api_error_to_logstr(APIError err);
const char *api_error_to_str(APIError err);
class APIFrameHelper {
public:
APIFrameHelper() = default;
explicit APIFrameHelper(std::unique_ptr<socket::Socket> socket, const ClientInfo *client_info)
: socket_owned_(std::move(socket)), client_info_(client_info) {
explicit APIFrameHelper(std::unique_ptr<socket::Socket> socket) : socket_owned_(std::move(socket)) {
socket_ = socket_owned_.get();
}
virtual ~APIFrameHelper() = default;
virtual APIError init() = 0;
virtual APIError loop();
virtual APIError read_packet(ReadPacketBuffer *buffer) = 0;
bool can_write_without_blocking() { return this->state_ == State::DATA && this->tx_buf_count_ == 0; }
bool can_write_without_blocking() { return state_ == State::DATA && tx_buf_.empty(); }
std::string getpeername() { return socket_->getpeername(); }
int getpeername(struct sockaddr *addr, socklen_t *addrlen) { return socket_->getpeername(addr, addrlen); }
APIError close() {
@@ -110,41 +94,44 @@ class APIFrameHelper {
}
return APIError::OK;
}
// Give this helper a name for logging
void set_log_info(std::string info) { info_ = std::move(info); }
virtual APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) = 0;
// Write multiple protobuf packets in a single operation
// packets contains (message_type, offset, length) for each message in the buffer
// The buffer contains all messages with appropriate padding before each
virtual APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) = 0;
// Get the frame header padding required by this protocol
uint8_t frame_header_padding() const { return frame_header_padding_; }
virtual uint8_t frame_header_padding() = 0;
// Get the frame footer size required by this protocol
uint8_t frame_footer_size() const { return frame_footer_size_; }
virtual uint8_t frame_footer_size() = 0;
// Check if socket has data ready to read
bool is_socket_ready() const { return socket_ != nullptr && socket_->ready(); }
protected:
// Struct for holding parsed frame data
struct ParsedFrame {
std::vector<uint8_t> msg;
};
// Buffer containing data to be sent
struct SendBuffer {
std::unique_ptr<uint8_t[]> data;
uint16_t size{0}; // Total size of the buffer
uint16_t offset{0}; // Current offset within the buffer
std::vector<uint8_t> data;
uint16_t offset{0}; // Current offset within the buffer (uint16_t to reduce memory usage)
// Using uint16_t reduces memory usage since ESPHome API messages are limited to UINT16_MAX (65535) bytes
uint16_t remaining() const { return size - offset; }
const uint8_t *current_data() const { return data.get() + offset; }
uint16_t remaining() const { return static_cast<uint16_t>(data.size()) - offset; }
const uint8_t *current_data() const { return data.data() + offset; }
};
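A small sketch (assumed usage, not in the diff) of how the SendBuffer bookkeeping above drives a partial socket write; sock and buf are illustrative.
ssize_t sent = sock->write(buf.current_data(), buf.remaining());
if (sent > 0) {
  buf.offset += static_cast<uint16_t>(sent);  // resume from here on the next attempt
}
bool done = buf.remaining() == 0;             // fully flushed, drop it from the tx queue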
// Common implementation for writing raw data to socket
APIError write_raw_(const struct iovec *iov, int iovcnt, uint16_t total_write_len);
APIError write_raw_(const struct iovec *iov, int iovcnt);
// Try to send data from the tx buffer
APIError try_send_tx_buf_();
// Helper method to buffer data from IOVs
void buffer_data_from_iov_(const struct iovec *iov, int iovcnt, uint16_t total_write_len, uint16_t offset);
// Common socket write error handling
APIError handle_socket_write_error_();
void buffer_data_from_iov_(const struct iovec *iov, int iovcnt, uint16_t total_write_len);
template<typename StateEnum>
APIError write_raw_(const struct iovec *iov, int iovcnt, socket::Socket *socket, std::vector<uint8_t> &tx_buf,
const std::string &info, StateEnum &state, StateEnum failed_state);
@@ -173,23 +160,17 @@ class APIFrameHelper {
};
// Containers (size varies, but typically 12+ bytes on 32-bit)
std::array<std::unique_ptr<SendBuffer>, API_MAX_SEND_QUEUE> tx_buf_;
std::deque<SendBuffer> tx_buf_;
std::string info_;
std::vector<struct iovec> reusable_iovs_;
std::vector<uint8_t> rx_buf_;
// Pointer to client info (4 bytes on 32-bit)
// Note: The pointed-to ClientInfo object must outlive this APIFrameHelper instance.
const ClientInfo *client_info_{nullptr};
// Group smaller types together
uint16_t rx_buf_len_ = 0;
State state_{State::INITIALIZE};
uint8_t frame_header_padding_{0};
uint8_t frame_footer_size_{0};
uint8_t tx_buf_head_{0};
uint8_t tx_buf_tail_{0};
uint8_t tx_buf_count_{0};
// 8 bytes total, 0 bytes padding
// 5 bytes total, 3 bytes padding
// Common initialization for both plaintext and noise protocols
APIError init_common_();
@@ -198,6 +179,105 @@ class APIFrameHelper {
APIError handle_socket_read_result_(ssize_t received);
};
} // namespace esphome::api
#ifdef USE_API_NOISE
class APINoiseFrameHelper : public APIFrameHelper {
public:
APINoiseFrameHelper(std::unique_ptr<socket::Socket> socket, std::shared_ptr<APINoiseContext> ctx)
: APIFrameHelper(std::move(socket)), ctx_(std::move(ctx)) {
// Noise header structure:
// Pos 0: indicator (0x01)
// Pos 1-2: encrypted payload size (16-bit big-endian)
// Pos 3-6: encrypted type (16-bit) + data_len (16-bit)
// Pos 7+: actual payload data
frame_header_padding_ = 7;
}
~APINoiseFrameHelper() override;
APIError init() override;
APIError loop() override;
APIError read_packet(ReadPacketBuffer *buffer) override;
APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
// Get the frame header padding required by this protocol
uint8_t frame_header_padding() override { return frame_header_padding_; }
// Get the frame footer size required by this protocol
uint8_t frame_footer_size() override { return frame_footer_size_; }
#endif // USE_API
protected:
APIError state_action_();
APIError try_read_frame_(ParsedFrame *frame);
APIError write_frame_(const uint8_t *data, uint16_t len);
APIError init_handshake_();
APIError check_handshake_finished_();
void send_explicit_handshake_reject_(const std::string &reason);
// Pointers first (4 bytes each)
NoiseHandshakeState *handshake_{nullptr};
NoiseCipherState *send_cipher_{nullptr};
NoiseCipherState *recv_cipher_{nullptr};
// Shared pointer (8 bytes on 32-bit = 4 bytes control block pointer + 4 bytes object pointer)
std::shared_ptr<APINoiseContext> ctx_;
// Vector (12 bytes on 32-bit)
std::vector<uint8_t> prologue_;
// NoiseProtocolId (size depends on implementation)
NoiseProtocolId nid_;
// Group small types together
// Fixed-size header buffer for noise protocol:
// 1 byte for indicator + 2 bytes for message size (16-bit value, not varint)
// Note: Maximum message size is UINT16_MAX (65535), with a limit of 128 bytes during handshake phase
uint8_t rx_header_buf_[3];
uint8_t rx_header_buf_len_ = 0;
// 4 bytes total, no padding
};
#endif // USE_API_NOISE
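A by-hand decode sketch of the 3-byte Noise frame header documented in the constructor comment above; the example bytes are assumptions.
uint8_t hdr[3] = {0x01, 0x00, 0x2A};                       // indicator, size hi, size lo
uint16_t encrypted_len = (uint16_t(hdr[1]) << 8) | hdr[2]; // 42-byte encrypted payload
// Once decrypted, the first four payload bytes carry the message type (16-bit)
// and data_len (16-bit), both big-endian, ahead of the protobuf bytes.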
#ifdef USE_API_PLAINTEXT
class APIPlaintextFrameHelper : public APIFrameHelper {
public:
APIPlaintextFrameHelper(std::unique_ptr<socket::Socket> socket) : APIFrameHelper(std::move(socket)) {
// Plaintext header structure (worst case):
// Pos 0: indicator (0x00)
// Pos 1-3: payload size varint (up to 3 bytes)
// Pos 4-5: message type varint (up to 2 bytes)
// Pos 6+: actual payload data
frame_header_padding_ = 6;
}
~APIPlaintextFrameHelper() override = default;
APIError init() override;
APIError loop() override;
APIError read_packet(ReadPacketBuffer *buffer) override;
APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
uint8_t frame_header_padding() override { return frame_header_padding_; }
// Get the frame footer size required by this protocol
uint8_t frame_footer_size() override { return frame_footer_size_; }
protected:
APIError try_read_frame_(ParsedFrame *frame);
// Group 2-byte aligned types
uint16_t rx_header_parsed_type_ = 0;
uint16_t rx_header_parsed_len_ = 0;
// Group 1-byte types together
// Fixed-size header buffer for plaintext protocol:
// We now store the indicator byte + the two varints.
// To match noise protocol's maximum message size (UINT16_MAX = 65535), we need:
// 1 byte for indicator + 3 bytes for message size varint (supports up to 2097151) + 2 bytes for message type varint
//
// While varints could theoretically be up to 10 bytes each for 64-bit values,
// attempting to process messages with headers that large would likely crash the
// ESP32 due to memory constraints.
uint8_t rx_header_buf_[6]; // 1 byte indicator + 5 bytes for varints (3 for size + 2 for type)
uint8_t rx_header_buf_pos_ = 0;
bool rx_header_parsed_ = false;
// 8 bytes total, no padding needed
};
#endif
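A sketch of the varint arithmetic behind the 3-byte size limit quoted above, assuming standard protobuf varint encoding (7 payload bits per byte):
size_t varint_len(uint32_t value) {
  size_t len = 1;
  while (value >= 0x80) {  // continuation bit set, another byte follows
    value >>= 7;
    ++len;
  }
  return len;
}
// varint_len(65535) == 3 and varint_len(2097151) == 3, but varint_len(2097152) == 4,
// which is why three size bytes comfortably cover the 65535-byte message cap.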
} // namespace api
} // namespace esphome
#endif


@@ -1,606 +0,0 @@
#include "api_frame_helper_noise.h"
#ifdef USE_API
#ifdef USE_API_NOISE
#include "api_connection.h" // For ClientInfo struct
#include "esphome/core/application.h"
#include "esphome/core/hal.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "proto.h"
#include <cstring>
#include <cinttypes>
#ifdef USE_ESP8266
#include <pgmspace.h>
#endif
namespace esphome::api {
static const char *const TAG = "api.noise";
#ifdef USE_ESP8266
static const char PROLOGUE_INIT[] PROGMEM = "NoiseAPIInit";
#else
static const char *const PROLOGUE_INIT = "NoiseAPIInit";
#endif
static constexpr size_t PROLOGUE_INIT_LEN = 12; // strlen("NoiseAPIInit")
#define HELPER_LOG(msg, ...) \
ESP_LOGVV(TAG, "%s (%s): " msg, this->client_info_->name.c_str(), this->client_info_->peername.c_str(), ##__VA_ARGS__)
#ifdef HELPER_LOG_PACKETS
#define LOG_PACKET_RECEIVED(buffer) ESP_LOGVV(TAG, "Received frame: %s", format_hex_pretty(buffer).c_str())
#define LOG_PACKET_SENDING(data, len) ESP_LOGVV(TAG, "Sending raw: %s", format_hex_pretty(data, len).c_str())
#else
#define LOG_PACKET_RECEIVED(buffer) ((void) 0)
#define LOG_PACKET_SENDING(data, len) ((void) 0)
#endif
/// Convert a noise error code to a readable error
const LogString *noise_err_to_logstr(int err) {
if (err == NOISE_ERROR_NO_MEMORY)
return LOG_STR("NO_MEMORY");
if (err == NOISE_ERROR_UNKNOWN_ID)
return LOG_STR("UNKNOWN_ID");
if (err == NOISE_ERROR_UNKNOWN_NAME)
return LOG_STR("UNKNOWN_NAME");
if (err == NOISE_ERROR_MAC_FAILURE)
return LOG_STR("MAC_FAILURE");
if (err == NOISE_ERROR_NOT_APPLICABLE)
return LOG_STR("NOT_APPLICABLE");
if (err == NOISE_ERROR_SYSTEM)
return LOG_STR("SYSTEM");
if (err == NOISE_ERROR_REMOTE_KEY_REQUIRED)
return LOG_STR("REMOTE_KEY_REQUIRED");
if (err == NOISE_ERROR_LOCAL_KEY_REQUIRED)
return LOG_STR("LOCAL_KEY_REQUIRED");
if (err == NOISE_ERROR_PSK_REQUIRED)
return LOG_STR("PSK_REQUIRED");
if (err == NOISE_ERROR_INVALID_LENGTH)
return LOG_STR("INVALID_LENGTH");
if (err == NOISE_ERROR_INVALID_PARAM)
return LOG_STR("INVALID_PARAM");
if (err == NOISE_ERROR_INVALID_STATE)
return LOG_STR("INVALID_STATE");
if (err == NOISE_ERROR_INVALID_NONCE)
return LOG_STR("INVALID_NONCE");
if (err == NOISE_ERROR_INVALID_PRIVATE_KEY)
return LOG_STR("INVALID_PRIVATE_KEY");
if (err == NOISE_ERROR_INVALID_PUBLIC_KEY)
return LOG_STR("INVALID_PUBLIC_KEY");
if (err == NOISE_ERROR_INVALID_FORMAT)
return LOG_STR("INVALID_FORMAT");
if (err == NOISE_ERROR_INVALID_SIGNATURE)
return LOG_STR("INVALID_SIGNATURE");
return LOG_STR("UNKNOWN");
}
/// Initialize the frame helper, returns OK if successful.
APIError APINoiseFrameHelper::init() {
APIError err = init_common_();
if (err != APIError::OK) {
return err;
}
// init prologue
size_t old_size = prologue_.size();
prologue_.resize(old_size + PROLOGUE_INIT_LEN);
#ifdef USE_ESP8266
memcpy_P(prologue_.data() + old_size, PROLOGUE_INIT, PROLOGUE_INIT_LEN);
#else
std::memcpy(prologue_.data() + old_size, PROLOGUE_INIT, PROLOGUE_INIT_LEN);
#endif
state_ = State::CLIENT_HELLO;
return APIError::OK;
}
// Helper for handling handshake frame errors
APIError APINoiseFrameHelper::handle_handshake_frame_error_(APIError aerr) {
if (aerr == APIError::BAD_INDICATOR) {
send_explicit_handshake_reject_(LOG_STR("Bad indicator byte"));
} else if (aerr == APIError::BAD_HANDSHAKE_PACKET_LEN) {
send_explicit_handshake_reject_(LOG_STR("Bad handshake packet len"));
}
return aerr;
}
// Helper for handling noise library errors
APIError APINoiseFrameHelper::handle_noise_error_(int err, const LogString *func_name, APIError api_err) {
if (err != 0) {
state_ = State::FAILED;
HELPER_LOG("%s failed: %s", LOG_STR_ARG(func_name), LOG_STR_ARG(noise_err_to_logstr(err)));
return api_err;
}
return APIError::OK;
}
/// Run through handshake messages (if in that phase)
APIError APINoiseFrameHelper::loop() {
// During handshake phase, process as many actions as possible until we can't progress
// socket_->ready() stays true until next main loop, but state_action() will return
// WOULD_BLOCK when no more data is available to read
while (state_ != State::DATA && this->socket_->ready()) {
APIError err = state_action_();
if (err == APIError::WOULD_BLOCK) {
break;
}
if (err != APIError::OK) {
return err;
}
}
// Use base class implementation for buffer sending
return APIFrameHelper::loop();
}
/** Read a packet into the rx_buf_.
*
* @return APIError::OK if a full packet is in rx_buf_
*
* errno EWOULDBLOCK: Packet could not be read without blocking. Try again later.
* errno ENOMEM: Not enough memory for reading packet.
* errno API_ERROR_BAD_INDICATOR: Bad indicator byte at start of frame.
* errno API_ERROR_HANDSHAKE_PACKET_LEN: Packet too big for this phase.
*/
APIError APINoiseFrameHelper::try_read_frame_() {
// read header
if (rx_header_buf_len_ < 3) {
// no header information yet
uint8_t to_read = 3 - rx_header_buf_len_;
ssize_t received = this->socket_->read(&rx_header_buf_[rx_header_buf_len_], to_read);
APIError err = handle_socket_read_result_(received);
if (err != APIError::OK) {
return err;
}
rx_header_buf_len_ += static_cast<uint8_t>(received);
if (static_cast<uint8_t>(received) != to_read) {
// not a full read
return APIError::WOULD_BLOCK;
}
if (rx_header_buf_[0] != 0x01) {
state_ = State::FAILED;
HELPER_LOG("Bad indicator byte %u", rx_header_buf_[0]);
return APIError::BAD_INDICATOR;
}
// header reading done
}
// read body
uint16_t msg_size = (((uint16_t) rx_header_buf_[1]) << 8) | rx_header_buf_[2];
// Check against size limits to prevent OOM: MAX_HANDSHAKE_SIZE for handshake, MAX_MESSAGE_SIZE for data
uint16_t limit = (state_ == State::DATA) ? MAX_MESSAGE_SIZE : MAX_HANDSHAKE_SIZE;
if (msg_size > limit) {
state_ = State::FAILED;
HELPER_LOG("Bad packet: message size %u exceeds maximum %u", msg_size, limit);
return (state_ == State::DATA) ? APIError::BAD_DATA_PACKET : APIError::BAD_HANDSHAKE_PACKET_LEN;
}
// Reserve space for body
if (this->rx_buf_.size() != msg_size) {
this->rx_buf_.resize(msg_size);
}
if (rx_buf_len_ < msg_size) {
// more data to read
uint16_t to_read = msg_size - rx_buf_len_;
ssize_t received = this->socket_->read(&rx_buf_[rx_buf_len_], to_read);
APIError err = handle_socket_read_result_(received);
if (err != APIError::OK) {
return err;
}
rx_buf_len_ += static_cast<uint16_t>(received);
if (static_cast<uint16_t>(received) != to_read) {
// not all read
return APIError::WOULD_BLOCK;
}
}
LOG_PACKET_RECEIVED(this->rx_buf_);
// Clear state for next frame (rx_buf_ still contains data for caller)
this->rx_buf_len_ = 0;
this->rx_header_buf_len_ = 0;
return APIError::OK;
}
/** To be called from read/write methods.
*
* This method runs through the internal handshake methods, if in that state.
*
* If the handshake is still active when this method returns and a read/write can't take place at
* the moment, returns WOULD_BLOCK.
* If an error occurred, returns that error. Only returns OK if the transport is ready for data
* traffic.
*/
APIError APINoiseFrameHelper::state_action_() {
int err;
APIError aerr;
if (state_ == State::INITIALIZE) {
HELPER_LOG("Bad state for method: %d", (int) state_);
return APIError::BAD_STATE;
}
if (state_ == State::CLIENT_HELLO) {
// waiting for client hello
aerr = this->try_read_frame_();
if (aerr != APIError::OK) {
return handle_handshake_frame_error_(aerr);
}
// ignore contents, may be used in future for flags
// Resize for: existing prologue + 2 size bytes + frame data
size_t old_size = this->prologue_.size();
this->prologue_.resize(old_size + 2 + this->rx_buf_.size());
this->prologue_[old_size] = (uint8_t) (this->rx_buf_.size() >> 8);
this->prologue_[old_size + 1] = (uint8_t) this->rx_buf_.size();
std::memcpy(this->prologue_.data() + old_size + 2, this->rx_buf_.data(), this->rx_buf_.size());
state_ = State::SERVER_HELLO;
}
if (state_ == State::SERVER_HELLO) {
// send server hello
const std::string &name = App.get_name();
const std::string &mac = get_mac_address();
std::vector<uint8_t> msg;
// Calculate positions and sizes
size_t name_len = name.size() + 1; // including null terminator
size_t mac_len = mac.size() + 1; // including null terminator
size_t name_offset = 1;
size_t mac_offset = name_offset + name_len;
size_t total_size = 1 + name_len + mac_len;
msg.resize(total_size);
// chosen proto
msg[0] = 0x01;
// node name, terminated by null byte
std::memcpy(msg.data() + name_offset, name.c_str(), name_len);
// node mac, terminated by null byte
std::memcpy(msg.data() + mac_offset, mac.c_str(), mac_len);
aerr = write_frame_(msg.data(), msg.size());
if (aerr != APIError::OK)
return aerr;
// start handshake
aerr = init_handshake_();
if (aerr != APIError::OK)
return aerr;
state_ = State::HANDSHAKE;
}
if (state_ == State::HANDSHAKE) {
int action = noise_handshakestate_get_action(handshake_);
if (action == NOISE_ACTION_READ_MESSAGE) {
// waiting for handshake msg
aerr = this->try_read_frame_();
if (aerr != APIError::OK) {
return handle_handshake_frame_error_(aerr);
}
if (this->rx_buf_.empty()) {
send_explicit_handshake_reject_(LOG_STR("Empty handshake message"));
return APIError::BAD_HANDSHAKE_ERROR_BYTE;
} else if (this->rx_buf_[0] != 0x00) {
HELPER_LOG("Bad handshake error byte: %u", this->rx_buf_[0]);
send_explicit_handshake_reject_(LOG_STR("Bad handshake error byte"));
return APIError::BAD_HANDSHAKE_ERROR_BYTE;
}
NoiseBuffer mbuf;
noise_buffer_init(mbuf);
noise_buffer_set_input(mbuf, this->rx_buf_.data() + 1, this->rx_buf_.size() - 1);
err = noise_handshakestate_read_message(handshake_, &mbuf, nullptr);
if (err != 0) {
// Special handling for MAC failure
send_explicit_handshake_reject_(err == NOISE_ERROR_MAC_FAILURE ? LOG_STR("Handshake MAC failure")
: LOG_STR("Handshake error"));
return handle_noise_error_(err, LOG_STR("noise_handshakestate_read_message"),
APIError::HANDSHAKESTATE_READ_FAILED);
}
aerr = check_handshake_finished_();
if (aerr != APIError::OK)
return aerr;
} else if (action == NOISE_ACTION_WRITE_MESSAGE) {
uint8_t buffer[65];
NoiseBuffer mbuf;
noise_buffer_init(mbuf);
noise_buffer_set_output(mbuf, buffer + 1, sizeof(buffer) - 1);
err = noise_handshakestate_write_message(handshake_, &mbuf, nullptr);
APIError aerr_write = handle_noise_error_(err, LOG_STR("noise_handshakestate_write_message"),
APIError::HANDSHAKESTATE_WRITE_FAILED);
if (aerr_write != APIError::OK)
return aerr_write;
buffer[0] = 0x00; // success
aerr = write_frame_(buffer, mbuf.size + 1);
if (aerr != APIError::OK)
return aerr;
aerr = check_handshake_finished_();
if (aerr != APIError::OK)
return aerr;
} else {
// bad state for action
state_ = State::FAILED;
HELPER_LOG("Bad action for handshake: %d", action);
return APIError::HANDSHAKESTATE_BAD_STATE;
}
}
if (state_ == State::CLOSED || state_ == State::FAILED) {
return APIError::BAD_STATE;
}
return APIError::OK;
}
void APINoiseFrameHelper::send_explicit_handshake_reject_(const LogString *reason) {
#ifdef USE_STORE_LOG_STR_IN_FLASH
// On ESP8266 with flash strings, we need to use PROGMEM-aware functions
size_t reason_len = strlen_P(reinterpret_cast<PGM_P>(reason));
std::vector<uint8_t> data;
data.resize(reason_len + 1);
data[0] = 0x01; // failure
// Copy error message from PROGMEM
if (reason_len > 0) {
memcpy_P(data.data() + 1, reinterpret_cast<PGM_P>(reason), reason_len);
}
#else
// Normal memory access
const char *reason_str = LOG_STR_ARG(reason);
size_t reason_len = strlen(reason_str);
std::vector<uint8_t> data;
data.resize(reason_len + 1);
data[0] = 0x01; // failure
// Copy error message in bulk
if (reason_len > 0) {
std::memcpy(data.data() + 1, reason_str, reason_len);
}
#endif
// temporarily remove failed state
auto orig_state = state_;
state_ = State::EXPLICIT_REJECT;
write_frame_(data.data(), data.size());
state_ = orig_state;
}
APIError APINoiseFrameHelper::read_packet(ReadPacketBuffer *buffer) {
APIError aerr = this->state_action_();
if (aerr != APIError::OK) {
return aerr;
}
if (this->state_ != State::DATA) {
return APIError::WOULD_BLOCK;
}
aerr = this->try_read_frame_();
if (aerr != APIError::OK)
return aerr;
NoiseBuffer mbuf;
noise_buffer_init(mbuf);
noise_buffer_set_inout(mbuf, this->rx_buf_.data(), this->rx_buf_.size(), this->rx_buf_.size());
int err = noise_cipherstate_decrypt(this->recv_cipher_, &mbuf);
APIError decrypt_err =
handle_noise_error_(err, LOG_STR("noise_cipherstate_decrypt"), APIError::CIPHERSTATE_DECRYPT_FAILED);
if (decrypt_err != APIError::OK) {
return decrypt_err;
}
uint16_t msg_size = mbuf.size;
uint8_t *msg_data = this->rx_buf_.data();
if (msg_size < 4) {
this->state_ = State::FAILED;
HELPER_LOG("Bad data packet: size %d too short", msg_size);
return APIError::BAD_DATA_PACKET;
}
uint16_t type = (((uint16_t) msg_data[0]) << 8) | msg_data[1];
uint16_t data_len = (((uint16_t) msg_data[2]) << 8) | msg_data[3];
if (data_len > msg_size - 4) {
this->state_ = State::FAILED;
HELPER_LOG("Bad data packet: data_len %u greater than msg_size %u", data_len, msg_size);
return APIError::BAD_DATA_PACKET;
}
buffer->container = std::move(this->rx_buf_);
buffer->data_offset = 4;
buffer->data_len = data_len;
buffer->type = type;
return APIError::OK;
}
APIError APINoiseFrameHelper::write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) {
// Resize to include MAC space (required for Noise encryption)
buffer.get_buffer()->resize(buffer.get_buffer()->size() + frame_footer_size_);
PacketInfo packet{type, 0,
static_cast<uint16_t>(buffer.get_buffer()->size() - frame_header_padding_ - frame_footer_size_)};
return write_protobuf_packets(buffer, std::span<const PacketInfo>(&packet, 1));
}
APIError APINoiseFrameHelper::write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) {
APIError aerr = state_action_();
if (aerr != APIError::OK) {
return aerr;
}
if (state_ != State::DATA) {
return APIError::WOULD_BLOCK;
}
if (packets.empty()) {
return APIError::OK;
}
std::vector<uint8_t> *raw_buffer = buffer.get_buffer();
uint8_t *buffer_data = raw_buffer->data(); // Cache buffer pointer
this->reusable_iovs_.clear();
this->reusable_iovs_.reserve(packets.size());
uint16_t total_write_len = 0;
// We need to encrypt each packet in place
for (const auto &packet : packets) {
// The buffer already has padding at offset
uint8_t *buf_start = buffer_data + packet.offset;
// Write noise header
buf_start[0] = 0x01; // indicator
// buf_start[1], buf_start[2] to be set after encryption
// Write message header (to be encrypted)
const uint8_t msg_offset = 3;
buf_start[msg_offset] = static_cast<uint8_t>(packet.message_type >> 8); // type high byte
buf_start[msg_offset + 1] = static_cast<uint8_t>(packet.message_type); // type low byte
buf_start[msg_offset + 2] = static_cast<uint8_t>(packet.payload_size >> 8); // data_len high byte
buf_start[msg_offset + 3] = static_cast<uint8_t>(packet.payload_size); // data_len low byte
// payload data is already in the buffer starting at offset + 7
// Make sure we have space for MAC
// The buffer should already have been sized appropriately
// Encrypt the message in place
NoiseBuffer mbuf;
noise_buffer_init(mbuf);
noise_buffer_set_inout(mbuf, buf_start + msg_offset, 4 + packet.payload_size,
4 + packet.payload_size + frame_footer_size_);
int err = noise_cipherstate_encrypt(send_cipher_, &mbuf);
APIError aerr =
handle_noise_error_(err, LOG_STR("noise_cipherstate_encrypt"), APIError::CIPHERSTATE_ENCRYPT_FAILED);
if (aerr != APIError::OK)
return aerr;
// Fill in the encrypted size
buf_start[1] = static_cast<uint8_t>(mbuf.size >> 8);
buf_start[2] = static_cast<uint8_t>(mbuf.size);
// Add iovec for this encrypted packet
size_t packet_len = static_cast<size_t>(3 + mbuf.size); // indicator + size + encrypted data
this->reusable_iovs_.push_back({buf_start, packet_len});
total_write_len += packet_len;
}
// Send all encrypted packets in one writev call
return this->write_raw_(this->reusable_iovs_.data(), this->reusable_iovs_.size(), total_write_len);
}
APIError APINoiseFrameHelper::write_frame_(const uint8_t *data, uint16_t len) {
uint8_t header[3];
header[0] = 0x01; // indicator
header[1] = (uint8_t) (len >> 8);
header[2] = (uint8_t) len;
struct iovec iov[2];
iov[0].iov_base = header;
iov[0].iov_len = 3;
if (len == 0) {
return this->write_raw_(iov, 1, 3); // Just header
}
iov[1].iov_base = const_cast<uint8_t *>(data);
iov[1].iov_len = len;
return this->write_raw_(iov, 2, 3 + len); // Header + data
}
/** Initiate the data structures for the handshake.
*
 * @return APIError::OK on success, or an APIError code on failure
*/
APIError APINoiseFrameHelper::init_handshake_() {
int err;
memset(&nid_, 0, sizeof(nid_));
// const char *proto = "Noise_NNpsk0_25519_ChaChaPoly_SHA256";
// err = noise_protocol_name_to_id(&nid_, proto, strlen(proto));
nid_.pattern_id = NOISE_PATTERN_NN;
nid_.cipher_id = NOISE_CIPHER_CHACHAPOLY;
nid_.dh_id = NOISE_DH_CURVE25519;
nid_.prefix_id = NOISE_PREFIX_STANDARD;
nid_.hybrid_id = NOISE_DH_NONE;
nid_.hash_id = NOISE_HASH_SHA256;
nid_.modifier_ids[0] = NOISE_MODIFIER_PSK0;
err = noise_handshakestate_new_by_id(&handshake_, &nid_, NOISE_ROLE_RESPONDER);
APIError aerr =
handle_noise_error_(err, LOG_STR("noise_handshakestate_new_by_id"), APIError::HANDSHAKESTATE_SETUP_FAILED);
if (aerr != APIError::OK)
return aerr;
const auto &psk = ctx_->get_psk();
err = noise_handshakestate_set_pre_shared_key(handshake_, psk.data(), psk.size());
aerr = handle_noise_error_(err, LOG_STR("noise_handshakestate_set_pre_shared_key"),
APIError::HANDSHAKESTATE_SETUP_FAILED);
if (aerr != APIError::OK)
return aerr;
err = noise_handshakestate_set_prologue(handshake_, prologue_.data(), prologue_.size());
aerr = handle_noise_error_(err, LOG_STR("noise_handshakestate_set_prologue"), APIError::HANDSHAKESTATE_SETUP_FAILED);
if (aerr != APIError::OK)
return aerr;
// set_prologue copies it into handshakestate, so we can get rid of it now
prologue_ = {};
err = noise_handshakestate_start(handshake_);
aerr = handle_noise_error_(err, LOG_STR("noise_handshakestate_start"), APIError::HANDSHAKESTATE_SETUP_FAILED);
if (aerr != APIError::OK)
return aerr;
return APIError::OK;
}
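For reference, the commented-out by-name setup above would look like the following sketch; the code keeps the explicit id fields instead so the protocol string never has to be parsed at runtime.
NoiseProtocolId nid{};
int rc = noise_protocol_name_to_id(&nid, "Noise_NNpsk0_25519_ChaChaPoly_SHA256",
                                   strlen("Noise_NNpsk0_25519_ChaChaPoly_SHA256"));
// On success (NOISE_ERROR_NONE) this fills the same pattern/cipher/dh/hash ids set manually above.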
APIError APINoiseFrameHelper::check_handshake_finished_() {
assert(state_ == State::HANDSHAKE);
int action = noise_handshakestate_get_action(handshake_);
if (action == NOISE_ACTION_READ_MESSAGE || action == NOISE_ACTION_WRITE_MESSAGE)
return APIError::OK;
if (action != NOISE_ACTION_SPLIT) {
state_ = State::FAILED;
HELPER_LOG("Bad action for handshake: %d", action);
return APIError::HANDSHAKESTATE_BAD_STATE;
}
int err = noise_handshakestate_split(handshake_, &send_cipher_, &recv_cipher_);
APIError aerr =
handle_noise_error_(err, LOG_STR("noise_handshakestate_split"), APIError::HANDSHAKESTATE_SPLIT_FAILED);
if (aerr != APIError::OK)
return aerr;
frame_footer_size_ = noise_cipherstate_get_mac_length(send_cipher_);
HELPER_LOG("Handshake complete!");
noise_handshakestate_free(handshake_);
handshake_ = nullptr;
state_ = State::DATA;
return APIError::OK;
}
APINoiseFrameHelper::~APINoiseFrameHelper() {
if (handshake_ != nullptr) {
noise_handshakestate_free(handshake_);
handshake_ = nullptr;
}
if (send_cipher_ != nullptr) {
noise_cipherstate_free(send_cipher_);
send_cipher_ = nullptr;
}
if (recv_cipher_ != nullptr) {
noise_cipherstate_free(recv_cipher_);
recv_cipher_ = nullptr;
}
}
extern "C" {
// declare how noise generates random bytes (here with a good HWRNG based on the RF system)
void noise_rand_bytes(void *output, size_t len) {
if (!esphome::random_bytes(reinterpret_cast<uint8_t *>(output), len)) {
ESP_LOGE(TAG, "Acquiring random bytes failed; rebooting");
arch_restart();
}
}
}
} // namespace esphome::api
#endif // USE_API_NOISE
#endif // USE_API


@@ -1,64 +0,0 @@
#pragma once
#include "api_frame_helper.h"
#ifdef USE_API
#ifdef USE_API_NOISE
#include "noise/protocol.h"
#include "api_noise_context.h"
namespace esphome::api {
class APINoiseFrameHelper final : public APIFrameHelper {
public:
APINoiseFrameHelper(std::unique_ptr<socket::Socket> socket, std::shared_ptr<APINoiseContext> ctx,
const ClientInfo *client_info)
: APIFrameHelper(std::move(socket), client_info), ctx_(std::move(ctx)) {
// Noise header structure:
// Pos 0: indicator (0x01)
// Pos 1-2: encrypted payload size (16-bit big-endian)
// Pos 3-6: encrypted type (16-bit) + data_len (16-bit)
// Pos 7+: actual payload data
frame_header_padding_ = 7;
}
~APINoiseFrameHelper() override;
APIError init() override;
APIError loop() override;
APIError read_packet(ReadPacketBuffer *buffer) override;
APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
protected:
APIError state_action_();
APIError try_read_frame_();
APIError write_frame_(const uint8_t *data, uint16_t len);
APIError init_handshake_();
APIError check_handshake_finished_();
void send_explicit_handshake_reject_(const LogString *reason);
APIError handle_handshake_frame_error_(APIError aerr);
APIError handle_noise_error_(int err, const LogString *func_name, APIError api_err);
// Pointers first (4 bytes each)
NoiseHandshakeState *handshake_{nullptr};
NoiseCipherState *send_cipher_{nullptr};
NoiseCipherState *recv_cipher_{nullptr};
// Shared pointer (8 bytes on 32-bit = 4 bytes control block pointer + 4 bytes object pointer)
std::shared_ptr<APINoiseContext> ctx_;
// Vector (12 bytes on 32-bit)
std::vector<uint8_t> prologue_;
// NoiseProtocolId (size depends on implementation)
NoiseProtocolId nid_;
// Group small types together
// Fixed-size header buffer for noise protocol:
// 1 byte for indicator + 2 bytes for message size (16-bit value, not varint)
// Note: Maximum message size is UINT16_MAX (65535), with a limit of 128 bytes during handshake phase
uint8_t rx_header_buf_[3];
uint8_t rx_header_buf_len_ = 0;
// 4 bytes total, no padding
};
} // namespace esphome::api
#endif // USE_API_NOISE
#endif // USE_API


@@ -1,294 +0,0 @@
#include "api_frame_helper_plaintext.h"
#ifdef USE_API
#ifdef USE_API_PLAINTEXT
#include "api_connection.h" // For ClientInfo struct
#include "esphome/core/application.h"
#include "esphome/core/hal.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "proto.h"
#include <cstring>
#include <cinttypes>
#ifdef USE_ESP8266
#include <pgmspace.h>
#endif
namespace esphome::api {
static const char *const TAG = "api.plaintext";
#define HELPER_LOG(msg, ...) \
ESP_LOGVV(TAG, "%s (%s): " msg, this->client_info_->name.c_str(), this->client_info_->peername.c_str(), ##__VA_ARGS__)
#ifdef HELPER_LOG_PACKETS
#define LOG_PACKET_RECEIVED(buffer) ESP_LOGVV(TAG, "Received frame: %s", format_hex_pretty(buffer).c_str())
#define LOG_PACKET_SENDING(data, len) ESP_LOGVV(TAG, "Sending raw: %s", format_hex_pretty(data, len).c_str())
#else
#define LOG_PACKET_RECEIVED(buffer) ((void) 0)
#define LOG_PACKET_SENDING(data, len) ((void) 0)
#endif
/// Initialize the frame helper, returns OK if successful.
APIError APIPlaintextFrameHelper::init() {
APIError err = init_common_();
if (err != APIError::OK) {
return err;
}
state_ = State::DATA;
return APIError::OK;
}
APIError APIPlaintextFrameHelper::loop() {
if (state_ != State::DATA) {
return APIError::BAD_STATE;
}
// Use base class implementation for buffer sending
return APIFrameHelper::loop();
}
/** Read a packet into the rx_buf_.
*
* @return See APIError
*
* error API_ERROR_BAD_INDICATOR: Bad indicator byte at start of frame.
*/
APIError APIPlaintextFrameHelper::try_read_frame_() {
// read header
while (!rx_header_parsed_) {
// Now that we know when the socket is ready, we can read up to 3 bytes
// into the rx_header_buf_ before we have to switch back to reading
// one byte at a time to ensure we don't read past the message and
// into the next one.
// Read directly into rx_header_buf_ at the current position
// Try to get to at least 3 bytes total (indicator + 2 varint bytes), then read one byte at a time
ssize_t received =
this->socket_->read(&rx_header_buf_[rx_header_buf_pos_], rx_header_buf_pos_ < 3 ? 3 - rx_header_buf_pos_ : 1);
APIError err = handle_socket_read_result_(received);
if (err != APIError::OK) {
return err;
}
// If this was the first read, validate the indicator byte
if (rx_header_buf_pos_ == 0 && received > 0) {
if (rx_header_buf_[0] != 0x00) {
state_ = State::FAILED;
HELPER_LOG("Bad indicator byte %u", rx_header_buf_[0]);
return APIError::BAD_INDICATOR;
}
}
rx_header_buf_pos_ += received;
// Check for buffer overflow
if (rx_header_buf_pos_ >= sizeof(rx_header_buf_)) {
state_ = State::FAILED;
HELPER_LOG("Header buffer overflow");
return APIError::BAD_DATA_PACKET;
}
// Need at least 3 bytes total (indicator + 2 varint bytes) before trying to parse
if (rx_header_buf_pos_ < 3) {
continue;
}
// At this point, we have at least 3 bytes total:
// - Validated indicator byte (0x00) stored at position 0
// - At least 2 bytes in the buffer for the varints
// Buffer layout:
// [0]: indicator byte (0x00)
// [1-3]: Message size varint (variable length)
// - 2 bytes would only allow up to 16383, which is less than noise's UINT16_MAX (65535)
// - 3 bytes allows up to 2097151, ensuring we support at least as much as noise
// [2-5]: Message type varint (variable length)
// We now attempt to parse both varints. If either is incomplete,
// we'll continue reading more bytes.
// Skip indicator byte at position 0
uint8_t varint_pos = 1;
uint32_t consumed = 0;
auto msg_size_varint = ProtoVarInt::parse(&rx_header_buf_[varint_pos], rx_header_buf_pos_ - varint_pos, &consumed);
if (!msg_size_varint.has_value()) {
// not enough data there yet
continue;
}
if (msg_size_varint->as_uint32() > MAX_MESSAGE_SIZE) {
state_ = State::FAILED;
HELPER_LOG("Bad packet: message size %" PRIu32 " exceeds maximum %u", msg_size_varint->as_uint32(),
MAX_MESSAGE_SIZE);
return APIError::BAD_DATA_PACKET;
}
rx_header_parsed_len_ = msg_size_varint->as_uint16();
// Move to next varint position
varint_pos += consumed;
auto msg_type_varint = ProtoVarInt::parse(&rx_header_buf_[varint_pos], rx_header_buf_pos_ - varint_pos, &consumed);
if (!msg_type_varint.has_value()) {
// not enough data there yet
continue;
}
if (msg_type_varint->as_uint32() > std::numeric_limits<uint16_t>::max()) {
state_ = State::FAILED;
HELPER_LOG("Bad packet: message type %" PRIu32 " exceeds maximum %u", msg_type_varint->as_uint32(),
std::numeric_limits<uint16_t>::max());
return APIError::BAD_DATA_PACKET;
}
rx_header_parsed_type_ = msg_type_varint->as_uint16();
rx_header_parsed_ = true;
}
// header reading done
// Reserve space for body
if (this->rx_buf_.size() != this->rx_header_parsed_len_) {
this->rx_buf_.resize(this->rx_header_parsed_len_);
}
if (rx_buf_len_ < rx_header_parsed_len_) {
// more data to read
uint16_t to_read = rx_header_parsed_len_ - rx_buf_len_;
ssize_t received = this->socket_->read(&rx_buf_[rx_buf_len_], to_read);
APIError err = handle_socket_read_result_(received);
if (err != APIError::OK) {
return err;
}
rx_buf_len_ += static_cast<uint16_t>(received);
if (static_cast<uint16_t>(received) != to_read) {
// not all read
return APIError::WOULD_BLOCK;
}
}
LOG_PACKET_RECEIVED(this->rx_buf_);
// Clear state for next frame (rx_buf_ still contains data for caller)
this->rx_buf_len_ = 0;
this->rx_header_buf_pos_ = 0;
this->rx_header_parsed_ = false;
return APIError::OK;
}
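// Note (illustrative, not part of the original file): the loop above reads up to 3 header
// bytes at once, then falls back to single-byte reads so the size and type varints can be
// parsed incrementally without consuming bytes that belong to the message payload.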
APIError APIPlaintextFrameHelper::read_packet(ReadPacketBuffer *buffer) {
if (this->state_ != State::DATA) {
return APIError::WOULD_BLOCK;
}
APIError aerr = this->try_read_frame_();
if (aerr != APIError::OK) {
if (aerr == APIError::BAD_INDICATOR) {
// Make sure to tell the remote that we don't
// understand the indicator byte so it knows
// we do not support it.
struct iovec iov[1];
// The \x00 first byte is the marker for plaintext.
//
// The remote will know how to handle the indicator byte,
// but it likely won't understand the rest of the message.
//
// We must send at least 3 bytes to be read, so we add
// a message after the indicator byte to ensure it's long
// enough and can aid in debugging.
static constexpr uint8_t INDICATOR_MSG_SIZE = 19;
#ifdef USE_ESP8266
static const char MSG_PROGMEM[] PROGMEM = "\x00"
"Bad indicator byte";
char msg[INDICATOR_MSG_SIZE];
memcpy_P(msg, MSG_PROGMEM, INDICATOR_MSG_SIZE);
iov[0].iov_base = (void *) msg;
#else
static const char MSG[] = "\x00"
"Bad indicator byte";
iov[0].iov_base = (void *) MSG;
#endif
iov[0].iov_len = INDICATOR_MSG_SIZE;
this->write_raw_(iov, 1, INDICATOR_MSG_SIZE);
}
return aerr;
}
buffer->container = std::move(this->rx_buf_);
buffer->data_offset = 0;
buffer->data_len = this->rx_header_parsed_len_;
buffer->type = this->rx_header_parsed_type_;
return APIError::OK;
}
APIError APIPlaintextFrameHelper::write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) {
PacketInfo packet{type, 0, static_cast<uint16_t>(buffer.get_buffer()->size() - frame_header_padding_)};
return write_protobuf_packets(buffer, std::span<const PacketInfo>(&packet, 1));
}
APIError APIPlaintextFrameHelper::write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) {
if (state_ != State::DATA) {
return APIError::BAD_STATE;
}
if (packets.empty()) {
return APIError::OK;
}
std::vector<uint8_t> *raw_buffer = buffer.get_buffer();
uint8_t *buffer_data = raw_buffer->data(); // Cache buffer pointer
this->reusable_iovs_.clear();
this->reusable_iovs_.reserve(packets.size());
uint16_t total_write_len = 0;
for (const auto &packet : packets) {
// Calculate varint sizes for header layout
uint8_t size_varint_len = api::ProtoSize::varint(static_cast<uint32_t>(packet.payload_size));
uint8_t type_varint_len = api::ProtoSize::varint(static_cast<uint32_t>(packet.message_type));
uint8_t total_header_len = 1 + size_varint_len + type_varint_len;
// Calculate where to start writing the header
// The header starts at the latest possible position to minimize unused padding
//
// Example 1 (small values): total_header_len = 3, header_offset = 6 - 3 = 3
// [0-2] - Unused padding
// [3] - 0x00 indicator byte
// [4] - Payload size varint (1 byte, for sizes 0-127)
// [5] - Message type varint (1 byte, for types 0-127)
// [6...] - Actual payload data
//
// Example 2 (medium values): total_header_len = 4, header_offset = 6 - 4 = 2
// [0-1] - Unused padding
// [2] - 0x00 indicator byte
// [3-4] - Payload size varint (2 bytes, for sizes 128-16383)
// [5] - Message type varint (1 byte, for types 0-127)
// [6...] - Actual payload data
//
// Example 3 (large values): total_header_len = 6, header_offset = 6 - 6 = 0
// [0] - 0x00 indicator byte
// [1-3] - Payload size varint (3 bytes, for sizes 16384-2097151)
// [4-5] - Message type varint (2 bytes, for types 128-32767)
// [6...] - Actual payload data
//
// The message starts at offset + frame_header_padding_
// So we write the header starting at offset + frame_header_padding_ - total_header_len
uint8_t *buf_start = buffer_data + packet.offset;
uint32_t header_offset = frame_header_padding_ - total_header_len;
// Write the plaintext header
buf_start[header_offset] = 0x00; // indicator
// Encode varints directly into buffer
ProtoVarInt(packet.payload_size).encode_to_buffer_unchecked(buf_start + header_offset + 1, size_varint_len);
ProtoVarInt(packet.message_type)
.encode_to_buffer_unchecked(buf_start + header_offset + 1 + size_varint_len, type_varint_len);
// Add iovec for this packet (header + payload)
size_t packet_len = static_cast<size_t>(total_header_len + packet.payload_size);
this->reusable_iovs_.push_back({buf_start + header_offset, packet_len});
total_write_len += packet_len;
}
// Send all packets in one writev call
return write_raw_(this->reusable_iovs_.data(), this->reusable_iovs_.size(), total_write_len);
}
} // namespace esphome::api
#endif // USE_API_PLAINTEXT
#endif // USE_API
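
The header-offset arithmetic commented in write_protobuf_packets above reduces to how many bytes each varint needs; a minimal sketch of that calculation, assuming api::ProtoSize::varint behaves like the generic 7-bits-per-byte helper below:

#include <cstdint>

// Sketch: number of bytes a value occupies as a protobuf varint (7 payload bits per byte).
inline uint8_t varint_len(uint32_t value) {
  uint8_t len = 1;
  while (value >= 0x80) {
    value >>= 7;
    len++;
  }
  return len;
}

// Worked example matching "Example 2" above: a 300-byte payload of message type 33 needs
//   1 (indicator) + varint_len(300) = 2 + varint_len(33) = 1  ->  4 header bytes,
// so the header starts at offset 6 - 4 = 2 and bytes 0-1 remain unused padding.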

View File

@@ -1,50 +0,0 @@
#pragma once
#include "api_frame_helper.h"
#ifdef USE_API
#ifdef USE_API_PLAINTEXT
namespace esphome::api {
class APIPlaintextFrameHelper final : public APIFrameHelper {
public:
APIPlaintextFrameHelper(std::unique_ptr<socket::Socket> socket, const ClientInfo *client_info)
: APIFrameHelper(std::move(socket), client_info) {
// Plaintext header structure (worst case):
// Pos 0: indicator (0x00)
// Pos 1-3: payload size varint (up to 3 bytes)
// Pos 4-5: message type varint (up to 2 bytes)
// Pos 6+: actual payload data
frame_header_padding_ = 6;
}
~APIPlaintextFrameHelper() override = default;
APIError init() override;
APIError loop() override;
APIError read_packet(ReadPacketBuffer *buffer) override;
APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) override;
APIError write_protobuf_packets(ProtoWriteBuffer buffer, std::span<const PacketInfo> packets) override;
protected:
APIError try_read_frame_();
// Group 2-byte aligned types
uint16_t rx_header_parsed_type_ = 0;
uint16_t rx_header_parsed_len_ = 0;
// Group 1-byte types together
// Fixed-size header buffer for plaintext protocol:
// We now store the indicator byte + the two varints.
// To match noise protocol's maximum message size (UINT16_MAX = 65535), we need:
// 1 byte for indicator + 3 bytes for message size varint (supports up to 2097151) + 2 bytes for message type varint
//
// While varints could theoretically be up to 10 bytes each for 64-bit values,
// attempting to process messages with headers that large would likely crash the
// ESP32 due to memory constraints.
uint8_t rx_header_buf_[6]; // 1 byte indicator + 5 bytes for varints (3 for size + 2 for type)
uint8_t rx_header_buf_pos_ = 0;
bool rx_header_parsed_ = false;
// 8 bytes total, no padding needed
};
} // namespace esphome::api
#endif // USE_API_PLAINTEXT
#endif // USE_API
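
The 6-byte buffer sizing above follows directly from the varint limits quoted in the comment; for reference:

// Varint capacity per encoded length (7 payload bits per byte):
//   1 byte  -> 2^7  - 1 = 127
//   2 bytes -> 2^14 - 1 = 16383   (too small for noise's UINT16_MAX of 65535)
//   3 bytes -> 2^21 - 1 = 2097151 (covers 65535 with room to spare)
// Hence 1 (indicator) + 3 (size varint) + 2 (type varint) = 6 bytes worst case.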

View File

@@ -3,7 +3,8 @@
#include <cstdint>
#include "esphome/core/defines.h"
namespace esphome::api {
namespace esphome {
namespace api {
#ifdef USE_API_NOISE
using psk_t = std::array<uint8_t, 32>;
@@ -27,4 +28,5 @@ class APINoiseContext {
};
#endif // USE_API_NOISE
} // namespace esphome::api
} // namespace api
} // namespace esphome

View File

@@ -27,41 +27,4 @@ extend google.protobuf.MessageOptions {
extend google.protobuf.FieldOptions {
optional string field_ifdef = 1042;
optional uint32 fixed_array_size = 50007;
optional bool no_zero_copy = 50008 [default=false];
optional bool fixed_array_skip_zero = 50009 [default=false];
optional string fixed_array_size_define = 50010;
optional string fixed_array_with_length_define = 50011;
// pointer_to_buffer: Use pointer instead of array for fixed-size byte fields
// When set, the field will be declared as a pointer (const uint8_t *data)
// instead of an array (uint8_t data[N]). This allows zero-copy on decode
// by pointing directly to the protobuf buffer. The buffer must remain valid
// until the message is processed (which is guaranteed for stack-allocated messages).
optional bool pointer_to_buffer = 50012 [default=false];
// container_pointer: Zero-copy optimization for repeated fields.
//
// When container_pointer is set on a repeated field, the generated message will
// store a pointer to an existing container instead of copying the data into the
// message's own repeated field. This eliminates heap allocations and improves performance.
//
// Requirements for safe usage:
// 1. The source container must remain valid until the message is encoded
// 2. Messages must be encoded immediately (which ESPHome does by default)
// 3. The container type must match the field type exactly
//
// Supported container types:
// - "std::vector<T>" for most repeated fields
// - "std::set<T>" for unique/sorted data
// - Full type specification required for enums (e.g., "std::set<climate::ClimateMode>")
//
// Example usage in .proto file:
// repeated string supported_modes = 12 [(container_pointer) = "std::set"];
// repeated ColorMode color_modes = 13 [(container_pointer) = "std::set<light::ColorMode>"];
//
// The corresponding C++ code must provide const reference access to a container
// that matches the specified type and remains valid during message encoding.
// This is typically done through methods returning const T& or special accessor
// methods like get_options() or supported_modes_for_api_().
optional string container_pointer = 50001;
}
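
As a purely hypothetical illustration of the container_pointer pattern documented above (the real generated message and field names live in the suppressed api_pb2 files, so the types below are stand-ins):

#include <set>

// Hypothetical stand-in for a generated message using container_pointer:
// the field is a borrowed pointer rather than an owned container.
struct FakeSupportedModesResponse {
  const std::set<int> *supported_modes{nullptr};
};

// Caller side: point at a long-lived container and encode immediately, as required above.
void fill_example(FakeSupportedModesResponse &msg, const std::set<int> &modes_owned_elsewhere) {
  msg.supported_modes = &modes_owned_elsewhere;  // zero-copy borrow, no heap allocation
}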

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -1,34 +0,0 @@
#pragma once
#include "esphome/core/defines.h"
// This file provides includes needed by the generated protobuf code
// when using pointer optimizations for component-specific types
#ifdef USE_CLIMATE
#include "esphome/components/climate/climate_mode.h"
#include "esphome/components/climate/climate_traits.h"
#endif
#ifdef USE_LIGHT
#include "esphome/components/light/light_traits.h"
#endif
#ifdef USE_FAN
#include "esphome/components/fan/fan_traits.h"
#endif
#ifdef USE_SELECT
#include "esphome/components/select/select_traits.h"
#endif
// Standard library includes that might be needed
#include <set>
#include <vector>
#include <string>
namespace esphome::api {
// This file only provides includes, no actual code
} // namespace esphome::api

View File

@@ -3,7 +3,8 @@
#include "api_pb2_service.h"
#include "esphome/core/log.h"
namespace esphome::api {
namespace esphome {
namespace api {
static const char *const TAG = "api.service";
@@ -15,7 +16,7 @@ void APIServerConnectionBase::log_send_message_(const char *name, const std::str
void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type, uint8_t *msg_data) {
switch (msg_type) {
case HelloRequest::MESSAGE_TYPE: {
case 1: {
HelloRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -24,81 +25,79 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
this->on_hello_request(msg);
break;
}
#ifdef USE_API_PASSWORD
case AuthenticationRequest::MESSAGE_TYPE: {
AuthenticationRequest msg;
case 3: {
ConnectRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_authentication_request: %s", msg.dump().c_str());
ESP_LOGVV(TAG, "on_connect_request: %s", msg.dump().c_str());
#endif
this->on_authentication_request(msg);
this->on_connect_request(msg);
break;
}
#endif
case DisconnectRequest::MESSAGE_TYPE: {
case 5: {
DisconnectRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_disconnect_request: %s", msg.dump().c_str());
#endif
this->on_disconnect_request(msg);
break;
}
case DisconnectResponse::MESSAGE_TYPE: {
case 6: {
DisconnectResponse msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_disconnect_response: %s", msg.dump().c_str());
#endif
this->on_disconnect_response(msg);
break;
}
case PingRequest::MESSAGE_TYPE: {
case 7: {
PingRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_ping_request: %s", msg.dump().c_str());
#endif
this->on_ping_request(msg);
break;
}
case PingResponse::MESSAGE_TYPE: {
case 8: {
PingResponse msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_ping_response: %s", msg.dump().c_str());
#endif
this->on_ping_response(msg);
break;
}
case DeviceInfoRequest::MESSAGE_TYPE: {
case 9: {
DeviceInfoRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_device_info_request: %s", msg.dump().c_str());
#endif
this->on_device_info_request(msg);
break;
}
case ListEntitiesRequest::MESSAGE_TYPE: {
case 11: {
ListEntitiesRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_list_entities_request: %s", msg.dump().c_str());
#endif
this->on_list_entities_request(msg);
break;
}
case SubscribeStatesRequest::MESSAGE_TYPE: {
case 20: {
SubscribeStatesRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_subscribe_states_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_states_request(msg);
break;
}
case SubscribeLogsRequest::MESSAGE_TYPE: {
case 28: {
SubscribeLogsRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -108,7 +107,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
break;
}
#ifdef USE_COVER
case CoverCommandRequest::MESSAGE_TYPE: {
case 30: {
CoverCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -119,7 +118,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_FAN
case FanCommandRequest::MESSAGE_TYPE: {
case 31: {
FanCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -130,7 +129,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_LIGHT
case LightCommandRequest::MESSAGE_TYPE: {
case 32: {
LightCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -141,7 +140,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_SWITCH
case SwitchCommandRequest::MESSAGE_TYPE: {
case 33: {
SwitchCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -151,18 +150,25 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
break;
}
#endif
#ifdef USE_API_HOMEASSISTANT_SERVICES
case SubscribeHomeassistantServicesRequest::MESSAGE_TYPE: {
case 34: {
SubscribeHomeassistantServicesRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_subscribe_homeassistant_services_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_homeassistant_services_request(msg);
break;
}
case 36: {
GetTimeRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_get_time_request: %s", msg.dump().c_str());
#endif
case GetTimeResponse::MESSAGE_TYPE: {
this->on_get_time_request(msg);
break;
}
case 37: {
GetTimeResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -171,19 +177,16 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
this->on_get_time_response(msg);
break;
}
#ifdef USE_API_HOMEASSISTANT_STATES
case SubscribeHomeAssistantStatesRequest::MESSAGE_TYPE: {
case 38: {
SubscribeHomeAssistantStatesRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_subscribe_home_assistant_states_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_home_assistant_states_request(msg);
break;
}
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
case HomeAssistantStateResponse::MESSAGE_TYPE: {
case 40: {
HomeAssistantStateResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -192,9 +195,8 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
this->on_home_assistant_state_response(msg);
break;
}
#endif
#ifdef USE_API_SERVICES
case ExecuteServiceRequest::MESSAGE_TYPE: {
case 42: {
ExecuteServiceRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -205,7 +207,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_CAMERA
case CameraImageRequest::MESSAGE_TYPE: {
case 45: {
CameraImageRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -216,7 +218,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_CLIMATE
case ClimateCommandRequest::MESSAGE_TYPE: {
case 48: {
ClimateCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -227,7 +229,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_NUMBER
case NumberCommandRequest::MESSAGE_TYPE: {
case 51: {
NumberCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -238,7 +240,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_SELECT
case SelectCommandRequest::MESSAGE_TYPE: {
case 54: {
SelectCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -249,7 +251,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_SIREN
case SirenCommandRequest::MESSAGE_TYPE: {
case 57: {
SirenCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -260,7 +262,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_LOCK
case LockCommandRequest::MESSAGE_TYPE: {
case 60: {
LockCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -271,7 +273,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BUTTON
case ButtonCommandRequest::MESSAGE_TYPE: {
case 62: {
ButtonCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -282,7 +284,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_MEDIA_PLAYER
case MediaPlayerCommandRequest::MESSAGE_TYPE: {
case 65: {
MediaPlayerCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -293,7 +295,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case SubscribeBluetoothLEAdvertisementsRequest::MESSAGE_TYPE: {
case 66: {
SubscribeBluetoothLEAdvertisementsRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -304,7 +306,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothDeviceRequest::MESSAGE_TYPE: {
case 68: {
BluetoothDeviceRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -315,7 +317,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothGATTGetServicesRequest::MESSAGE_TYPE: {
case 70: {
BluetoothGATTGetServicesRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -326,7 +328,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothGATTReadRequest::MESSAGE_TYPE: {
case 73: {
BluetoothGATTReadRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -337,7 +339,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothGATTWriteRequest::MESSAGE_TYPE: {
case 75: {
BluetoothGATTWriteRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -348,7 +350,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothGATTReadDescriptorRequest::MESSAGE_TYPE: {
case 76: {
BluetoothGATTReadDescriptorRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -359,7 +361,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothGATTWriteDescriptorRequest::MESSAGE_TYPE: {
case 77: {
BluetoothGATTWriteDescriptorRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -370,7 +372,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothGATTNotifyRequest::MESSAGE_TYPE: {
case 78: {
BluetoothGATTNotifyRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -381,9 +383,9 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case SubscribeBluetoothConnectionsFreeRequest::MESSAGE_TYPE: {
case 80: {
SubscribeBluetoothConnectionsFreeRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_subscribe_bluetooth_connections_free_request: %s", msg.dump().c_str());
#endif
@@ -392,9 +394,9 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case UnsubscribeBluetoothLEAdvertisementsRequest::MESSAGE_TYPE: {
case 87: {
UnsubscribeBluetoothLEAdvertisementsRequest msg;
// Empty message: no decode needed
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_unsubscribe_bluetooth_le_advertisements_request: %s", msg.dump().c_str());
#endif
@@ -403,7 +405,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case SubscribeVoiceAssistantRequest::MESSAGE_TYPE: {
case 89: {
SubscribeVoiceAssistantRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -414,7 +416,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case VoiceAssistantResponse::MESSAGE_TYPE: {
case 91: {
VoiceAssistantResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -425,7 +427,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case VoiceAssistantEventResponse::MESSAGE_TYPE: {
case 92: {
VoiceAssistantEventResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -436,7 +438,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_ALARM_CONTROL_PANEL
case AlarmControlPanelCommandRequest::MESSAGE_TYPE: {
case 96: {
AlarmControlPanelCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -447,7 +449,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_TEXT
case TextCommandRequest::MESSAGE_TYPE: {
case 99: {
TextCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -458,7 +460,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_DATETIME_DATE
case DateCommandRequest::MESSAGE_TYPE: {
case 102: {
DateCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -469,7 +471,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_DATETIME_TIME
case TimeCommandRequest::MESSAGE_TYPE: {
case 105: {
TimeCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -480,7 +482,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case VoiceAssistantAudio::MESSAGE_TYPE: {
case 106: {
VoiceAssistantAudio msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -491,7 +493,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VALVE
case ValveCommandRequest::MESSAGE_TYPE: {
case 111: {
ValveCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -502,7 +504,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_DATETIME_DATETIME
case DateTimeCommandRequest::MESSAGE_TYPE: {
case 114: {
DateTimeCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -513,7 +515,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case VoiceAssistantTimerEventResponse::MESSAGE_TYPE: {
case 115: {
VoiceAssistantTimerEventResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -524,7 +526,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_UPDATE
case UpdateCommandRequest::MESSAGE_TYPE: {
case 118: {
UpdateCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -535,7 +537,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case VoiceAssistantAnnounceRequest::MESSAGE_TYPE: {
case 119: {
VoiceAssistantAnnounceRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -546,7 +548,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case VoiceAssistantConfigurationRequest::MESSAGE_TYPE: {
case 121: {
VoiceAssistantConfigurationRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -557,7 +559,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_VOICE_ASSISTANT
case VoiceAssistantSetConfiguration::MESSAGE_TYPE: {
case 123: {
VoiceAssistantSetConfiguration msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -568,7 +570,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_API_NOISE
case NoiseEncryptionSetKeyRequest::MESSAGE_TYPE: {
case 124: {
NoiseEncryptionSetKeyRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -579,7 +581,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case BluetoothScannerSetModeRequest::MESSAGE_TYPE: {
case 127: {
BluetoothScannerSetModeRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
@@ -588,39 +590,6 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
this->on_bluetooth_scanner_set_mode_request(msg);
break;
}
#endif
#ifdef USE_ZWAVE_PROXY
case ZWaveProxyFrame::MESSAGE_TYPE: {
ZWaveProxyFrame msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_z_wave_proxy_frame: %s", msg.dump().c_str());
#endif
this->on_z_wave_proxy_frame(msg);
break;
}
#endif
#ifdef USE_ZWAVE_PROXY
case ZWaveProxyRequest::MESSAGE_TYPE: {
ZWaveProxyRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_z_wave_proxy_request: %s", msg.dump().c_str());
#endif
this->on_z_wave_proxy_request(msg);
break;
}
#endif
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
case HomeassistantActionResponse::MESSAGE_TYPE: {
HomeassistantActionResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
ESP_LOGVV(TAG, "on_homeassistant_action_response: %s", msg.dump().c_str());
#endif
this->on_homeassistant_action_response(msg);
break;
}
#endif
default:
break;
@@ -628,230 +597,328 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
}
void APIServerConnection::on_hello_request(const HelloRequest &msg) {
if (!this->send_hello_response(msg)) {
HelloResponse ret = this->hello(msg);
if (!this->send_message(ret, HelloResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
#ifdef USE_API_PASSWORD
void APIServerConnection::on_authentication_request(const AuthenticationRequest &msg) {
if (!this->send_authenticate_response(msg)) {
void APIServerConnection::on_connect_request(const ConnectRequest &msg) {
ConnectResponse ret = this->connect(msg);
if (!this->send_message(ret, ConnectResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
#endif
void APIServerConnection::on_disconnect_request(const DisconnectRequest &msg) {
if (!this->send_disconnect_response(msg)) {
DisconnectResponse ret = this->disconnect(msg);
if (!this->send_message(ret, DisconnectResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
void APIServerConnection::on_ping_request(const PingRequest &msg) {
if (!this->send_ping_response(msg)) {
PingResponse ret = this->ping(msg);
if (!this->send_message(ret, PingResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
void APIServerConnection::on_device_info_request(const DeviceInfoRequest &msg) {
if (!this->send_device_info_response(msg)) {
this->on_fatal_error();
if (this->check_connection_setup_()) {
DeviceInfoResponse ret = this->device_info(msg);
if (!this->send_message(ret, DeviceInfoResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
}
void APIServerConnection::on_list_entities_request(const ListEntitiesRequest &msg) { this->list_entities(msg); }
void APIServerConnection::on_subscribe_states_request(const SubscribeStatesRequest &msg) {
this->subscribe_states(msg);
void APIServerConnection::on_list_entities_request(const ListEntitiesRequest &msg) {
if (this->check_authenticated_()) {
this->list_entities(msg);
}
}
void APIServerConnection::on_subscribe_states_request(const SubscribeStatesRequest &msg) {
if (this->check_authenticated_()) {
this->subscribe_states(msg);
}
}
void APIServerConnection::on_subscribe_logs_request(const SubscribeLogsRequest &msg) {
if (this->check_authenticated_()) {
this->subscribe_logs(msg);
}
}
void APIServerConnection::on_subscribe_logs_request(const SubscribeLogsRequest &msg) { this->subscribe_logs(msg); }
#ifdef USE_API_HOMEASSISTANT_SERVICES
void APIServerConnection::on_subscribe_homeassistant_services_request(
const SubscribeHomeassistantServicesRequest &msg) {
this->subscribe_homeassistant_services(msg);
if (this->check_authenticated_()) {
this->subscribe_homeassistant_services(msg);
}
}
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
void APIServerConnection::on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &msg) {
this->subscribe_home_assistant_states(msg);
if (this->check_authenticated_()) {
this->subscribe_home_assistant_states(msg);
}
}
void APIServerConnection::on_get_time_request(const GetTimeRequest &msg) {
if (this->check_connection_setup_()) {
GetTimeResponse ret = this->get_time(msg);
if (!this->send_message(ret, GetTimeResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
}
#endif
#ifdef USE_API_SERVICES
void APIServerConnection::on_execute_service_request(const ExecuteServiceRequest &msg) { this->execute_service(msg); }
void APIServerConnection::on_execute_service_request(const ExecuteServiceRequest &msg) {
if (this->check_authenticated_()) {
this->execute_service(msg);
}
}
#endif
#ifdef USE_API_NOISE
void APIServerConnection::on_noise_encryption_set_key_request(const NoiseEncryptionSetKeyRequest &msg) {
if (!this->send_noise_encryption_set_key_response(msg)) {
this->on_fatal_error();
if (this->check_authenticated_()) {
NoiseEncryptionSetKeyResponse ret = this->noise_encryption_set_key(msg);
if (!this->send_message(ret, NoiseEncryptionSetKeyResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
}
#endif
#ifdef USE_BUTTON
void APIServerConnection::on_button_command_request(const ButtonCommandRequest &msg) { this->button_command(msg); }
void APIServerConnection::on_button_command_request(const ButtonCommandRequest &msg) {
if (this->check_authenticated_()) {
this->button_command(msg);
}
}
#endif
#ifdef USE_CAMERA
void APIServerConnection::on_camera_image_request(const CameraImageRequest &msg) { this->camera_image(msg); }
void APIServerConnection::on_camera_image_request(const CameraImageRequest &msg) {
if (this->check_authenticated_()) {
this->camera_image(msg);
}
}
#endif
#ifdef USE_CLIMATE
void APIServerConnection::on_climate_command_request(const ClimateCommandRequest &msg) { this->climate_command(msg); }
void APIServerConnection::on_climate_command_request(const ClimateCommandRequest &msg) {
if (this->check_authenticated_()) {
this->climate_command(msg);
}
}
#endif
#ifdef USE_COVER
void APIServerConnection::on_cover_command_request(const CoverCommandRequest &msg) { this->cover_command(msg); }
void APIServerConnection::on_cover_command_request(const CoverCommandRequest &msg) {
if (this->check_authenticated_()) {
this->cover_command(msg);
}
}
#endif
#ifdef USE_DATETIME_DATE
void APIServerConnection::on_date_command_request(const DateCommandRequest &msg) { this->date_command(msg); }
void APIServerConnection::on_date_command_request(const DateCommandRequest &msg) {
if (this->check_authenticated_()) {
this->date_command(msg);
}
}
#endif
#ifdef USE_DATETIME_DATETIME
void APIServerConnection::on_date_time_command_request(const DateTimeCommandRequest &msg) {
this->datetime_command(msg);
if (this->check_authenticated_()) {
this->datetime_command(msg);
}
}
#endif
#ifdef USE_FAN
void APIServerConnection::on_fan_command_request(const FanCommandRequest &msg) { this->fan_command(msg); }
void APIServerConnection::on_fan_command_request(const FanCommandRequest &msg) {
if (this->check_authenticated_()) {
this->fan_command(msg);
}
}
#endif
#ifdef USE_LIGHT
void APIServerConnection::on_light_command_request(const LightCommandRequest &msg) { this->light_command(msg); }
void APIServerConnection::on_light_command_request(const LightCommandRequest &msg) {
if (this->check_authenticated_()) {
this->light_command(msg);
}
}
#endif
#ifdef USE_LOCK
void APIServerConnection::on_lock_command_request(const LockCommandRequest &msg) { this->lock_command(msg); }
void APIServerConnection::on_lock_command_request(const LockCommandRequest &msg) {
if (this->check_authenticated_()) {
this->lock_command(msg);
}
}
#endif
#ifdef USE_MEDIA_PLAYER
void APIServerConnection::on_media_player_command_request(const MediaPlayerCommandRequest &msg) {
this->media_player_command(msg);
if (this->check_authenticated_()) {
this->media_player_command(msg);
}
}
#endif
#ifdef USE_NUMBER
void APIServerConnection::on_number_command_request(const NumberCommandRequest &msg) { this->number_command(msg); }
void APIServerConnection::on_number_command_request(const NumberCommandRequest &msg) {
if (this->check_authenticated_()) {
this->number_command(msg);
}
}
#endif
#ifdef USE_SELECT
void APIServerConnection::on_select_command_request(const SelectCommandRequest &msg) { this->select_command(msg); }
void APIServerConnection::on_select_command_request(const SelectCommandRequest &msg) {
if (this->check_authenticated_()) {
this->select_command(msg);
}
}
#endif
#ifdef USE_SIREN
void APIServerConnection::on_siren_command_request(const SirenCommandRequest &msg) { this->siren_command(msg); }
void APIServerConnection::on_siren_command_request(const SirenCommandRequest &msg) {
if (this->check_authenticated_()) {
this->siren_command(msg);
}
}
#endif
#ifdef USE_SWITCH
void APIServerConnection::on_switch_command_request(const SwitchCommandRequest &msg) { this->switch_command(msg); }
void APIServerConnection::on_switch_command_request(const SwitchCommandRequest &msg) {
if (this->check_authenticated_()) {
this->switch_command(msg);
}
}
#endif
#ifdef USE_TEXT
void APIServerConnection::on_text_command_request(const TextCommandRequest &msg) { this->text_command(msg); }
void APIServerConnection::on_text_command_request(const TextCommandRequest &msg) {
if (this->check_authenticated_()) {
this->text_command(msg);
}
}
#endif
#ifdef USE_DATETIME_TIME
void APIServerConnection::on_time_command_request(const TimeCommandRequest &msg) { this->time_command(msg); }
void APIServerConnection::on_time_command_request(const TimeCommandRequest &msg) {
if (this->check_authenticated_()) {
this->time_command(msg);
}
}
#endif
#ifdef USE_UPDATE
void APIServerConnection::on_update_command_request(const UpdateCommandRequest &msg) { this->update_command(msg); }
void APIServerConnection::on_update_command_request(const UpdateCommandRequest &msg) {
if (this->check_authenticated_()) {
this->update_command(msg);
}
}
#endif
#ifdef USE_VALVE
void APIServerConnection::on_valve_command_request(const ValveCommandRequest &msg) { this->valve_command(msg); }
void APIServerConnection::on_valve_command_request(const ValveCommandRequest &msg) {
if (this->check_authenticated_()) {
this->valve_command(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_subscribe_bluetooth_le_advertisements_request(
const SubscribeBluetoothLEAdvertisementsRequest &msg) {
this->subscribe_bluetooth_le_advertisements(msg);
if (this->check_authenticated_()) {
this->subscribe_bluetooth_le_advertisements(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_device_request(const BluetoothDeviceRequest &msg) {
this->bluetooth_device_request(msg);
if (this->check_authenticated_()) {
this->bluetooth_device_request(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_get_services_request(const BluetoothGATTGetServicesRequest &msg) {
this->bluetooth_gatt_get_services(msg);
if (this->check_authenticated_()) {
this->bluetooth_gatt_get_services(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_read_request(const BluetoothGATTReadRequest &msg) {
this->bluetooth_gatt_read(msg);
if (this->check_authenticated_()) {
this->bluetooth_gatt_read(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_write_request(const BluetoothGATTWriteRequest &msg) {
this->bluetooth_gatt_write(msg);
if (this->check_authenticated_()) {
this->bluetooth_gatt_write(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_read_descriptor_request(const BluetoothGATTReadDescriptorRequest &msg) {
this->bluetooth_gatt_read_descriptor(msg);
if (this->check_authenticated_()) {
this->bluetooth_gatt_read_descriptor(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_write_descriptor_request(const BluetoothGATTWriteDescriptorRequest &msg) {
this->bluetooth_gatt_write_descriptor(msg);
if (this->check_authenticated_()) {
this->bluetooth_gatt_write_descriptor(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_notify_request(const BluetoothGATTNotifyRequest &msg) {
this->bluetooth_gatt_notify(msg);
if (this->check_authenticated_()) {
this->bluetooth_gatt_notify(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_subscribe_bluetooth_connections_free_request(
const SubscribeBluetoothConnectionsFreeRequest &msg) {
if (!this->send_subscribe_bluetooth_connections_free_response(msg)) {
this->on_fatal_error();
if (this->check_authenticated_()) {
BluetoothConnectionsFreeResponse ret = this->subscribe_bluetooth_connections_free(msg);
if (!this->send_message(ret, BluetoothConnectionsFreeResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_unsubscribe_bluetooth_le_advertisements_request(
const UnsubscribeBluetoothLEAdvertisementsRequest &msg) {
this->unsubscribe_bluetooth_le_advertisements(msg);
if (this->check_authenticated_()) {
this->unsubscribe_bluetooth_le_advertisements(msg);
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_scanner_set_mode_request(const BluetoothScannerSetModeRequest &msg) {
this->bluetooth_scanner_set_mode(msg);
if (this->check_authenticated_()) {
this->bluetooth_scanner_set_mode(msg);
}
}
#endif
#ifdef USE_VOICE_ASSISTANT
void APIServerConnection::on_subscribe_voice_assistant_request(const SubscribeVoiceAssistantRequest &msg) {
this->subscribe_voice_assistant(msg);
if (this->check_authenticated_()) {
this->subscribe_voice_assistant(msg);
}
}
#endif
#ifdef USE_VOICE_ASSISTANT
void APIServerConnection::on_voice_assistant_configuration_request(const VoiceAssistantConfigurationRequest &msg) {
if (!this->send_voice_assistant_get_configuration_response(msg)) {
this->on_fatal_error();
if (this->check_authenticated_()) {
VoiceAssistantConfigurationResponse ret = this->voice_assistant_get_configuration(msg);
if (!this->send_message(ret, VoiceAssistantConfigurationResponse::MESSAGE_TYPE)) {
this->on_fatal_error();
}
}
}
#endif
#ifdef USE_VOICE_ASSISTANT
void APIServerConnection::on_voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) {
this->voice_assistant_set_configuration(msg);
if (this->check_authenticated_()) {
this->voice_assistant_set_configuration(msg);
}
}
#endif
#ifdef USE_ALARM_CONTROL_PANEL
void APIServerConnection::on_alarm_control_panel_command_request(const AlarmControlPanelCommandRequest &msg) {
this->alarm_control_panel_command(msg);
}
#endif
#ifdef USE_ZWAVE_PROXY
void APIServerConnection::on_z_wave_proxy_frame(const ZWaveProxyFrame &msg) { this->zwave_proxy_frame(msg); }
#endif
#ifdef USE_ZWAVE_PROXY
void APIServerConnection::on_z_wave_proxy_request(const ZWaveProxyRequest &msg) { this->zwave_proxy_request(msg); }
#endif
void APIServerConnection::read_message(uint32_t msg_size, uint32_t msg_type, uint8_t *msg_data) {
// Check authentication/connection requirements for messages
switch (msg_type) {
case HelloRequest::MESSAGE_TYPE: // No setup required
#ifdef USE_API_PASSWORD
case AuthenticationRequest::MESSAGE_TYPE: // No setup required
#endif
case DisconnectRequest::MESSAGE_TYPE: // No setup required
case PingRequest::MESSAGE_TYPE: // No setup required
break; // Skip all checks for these messages
case DeviceInfoRequest::MESSAGE_TYPE: // Connection setup only
if (!this->check_connection_setup_()) {
return; // Connection not setup
}
break;
default:
// All other messages require authentication (which includes connection check)
if (!this->check_authenticated_()) {
return; // Authentication failed
}
break;
if (this->check_authenticated_()) {
this->alarm_control_panel_command(msg);
}
// Call base implementation to process the message
APIServerConnectionBase::read_message(msg_size, msg_type, msg_data);
}
#endif
} // namespace esphome::api
} // namespace api
} // namespace esphome
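
Both gating styles are visible in the hunks above: the older generated code wraps every handler in check_authenticated_(), while the newer code classifies each message once before dispatch. A condensed, standalone sketch of the centralized form, using stand-in helpers and the numeric message types shown earlier in the diff:

#include <cstdint>

// Stand-in message type constants and check helpers; the real ones come from the generated
// api_pb2 headers and the connection class in the diff above.
constexpr uint32_t HELLO_REQUEST = 1, DISCONNECT_REQUEST = 5, PING_REQUEST = 7, DEVICE_INFO_REQUEST = 9;
bool check_connection_setup() { return true; }                                   // stand-in
bool check_authenticated() { return true; }                                      // stand-in
void dispatch_message(uint32_t msg_type, const uint8_t *msg_data, uint32_t msg_size) {}  // stand-in

// Sketch: classify the message type once, then defer to the generated dispatcher.
void gated_read_message(uint32_t msg_size, uint32_t msg_type, uint8_t *msg_data) {
  switch (msg_type) {
    case HELLO_REQUEST:        // pre-setup messages need no checks
    case DISCONNECT_REQUEST:
    case PING_REQUEST:
      break;
    case DEVICE_INFO_REQUEST:  // requires only connection setup
      if (!check_connection_setup())
        return;
      break;
    default:                   // everything else requires authentication
      if (!check_authenticated())
        return;
      break;
  }
  dispatch_message(msg_type, msg_data, msg_size);
}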

View File

@@ -6,7 +6,8 @@
#include "api_pb2.h"
namespace esphome::api {
namespace esphome {
namespace api {
class APIServerConnectionBase : public ProtoService {
public:
@@ -26,9 +27,7 @@ class APIServerConnectionBase : public ProtoService {
virtual void on_hello_request(const HelloRequest &value){};
#ifdef USE_API_PASSWORD
virtual void on_authentication_request(const AuthenticationRequest &value){};
#endif
virtual void on_connect_request(const ConnectRequest &value){};
virtual void on_disconnect_request(const DisconnectRequest &value){};
virtual void on_disconnect_response(const DisconnectResponse &value){};
@@ -62,21 +61,12 @@ class APIServerConnectionBase : public ProtoService {
virtual void on_noise_encryption_set_key_request(const NoiseEncryptionSetKeyRequest &value){};
#endif
#ifdef USE_API_HOMEASSISTANT_SERVICES
virtual void on_subscribe_homeassistant_services_request(const SubscribeHomeassistantServicesRequest &value){};
#endif
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
virtual void on_homeassistant_action_response(const HomeassistantActionResponse &value){};
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
virtual void on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &value){};
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
virtual void on_home_assistant_state_response(const HomeAssistantStateResponse &value){};
#endif
virtual void on_get_time_request(const GetTimeRequest &value){};
virtual void on_get_time_response(const GetTimeResponse &value){};
#ifdef USE_API_SERVICES
@@ -210,12 +200,6 @@ class APIServerConnectionBase : public ProtoService {
#ifdef USE_UPDATE
virtual void on_update_command_request(const UpdateCommandRequest &value){};
#endif
#ifdef USE_ZWAVE_PROXY
virtual void on_z_wave_proxy_frame(const ZWaveProxyFrame &value){};
#endif
#ifdef USE_ZWAVE_PROXY
virtual void on_z_wave_proxy_request(const ZWaveProxyRequest &value){};
#endif
protected:
void read_message(uint32_t msg_size, uint32_t msg_type, uint8_t *msg_data) override;
@@ -223,27 +207,22 @@ class APIServerConnectionBase : public ProtoService {
class APIServerConnection : public APIServerConnectionBase {
public:
virtual bool send_hello_response(const HelloRequest &msg) = 0;
#ifdef USE_API_PASSWORD
virtual bool send_authenticate_response(const AuthenticationRequest &msg) = 0;
#endif
virtual bool send_disconnect_response(const DisconnectRequest &msg) = 0;
virtual bool send_ping_response(const PingRequest &msg) = 0;
virtual bool send_device_info_response(const DeviceInfoRequest &msg) = 0;
virtual HelloResponse hello(const HelloRequest &msg) = 0;
virtual ConnectResponse connect(const ConnectRequest &msg) = 0;
virtual DisconnectResponse disconnect(const DisconnectRequest &msg) = 0;
virtual PingResponse ping(const PingRequest &msg) = 0;
virtual DeviceInfoResponse device_info(const DeviceInfoRequest &msg) = 0;
virtual void list_entities(const ListEntitiesRequest &msg) = 0;
virtual void subscribe_states(const SubscribeStatesRequest &msg) = 0;
virtual void subscribe_logs(const SubscribeLogsRequest &msg) = 0;
#ifdef USE_API_HOMEASSISTANT_SERVICES
virtual void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) = 0;
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
virtual void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) = 0;
#endif
virtual GetTimeResponse get_time(const GetTimeRequest &msg) = 0;
#ifdef USE_API_SERVICES
virtual void execute_service(const ExecuteServiceRequest &msg) = 0;
#endif
#ifdef USE_API_NOISE
virtual bool send_noise_encryption_set_key_response(const NoiseEncryptionSetKeyRequest &msg) = 0;
virtual NoiseEncryptionSetKeyResponse noise_encryption_set_key(const NoiseEncryptionSetKeyRequest &msg) = 0;
#endif
#ifdef USE_BUTTON
virtual void button_command(const ButtonCommandRequest &msg) = 0;
@@ -324,7 +303,7 @@ class APIServerConnection : public APIServerConnectionBase {
virtual void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual bool send_subscribe_bluetooth_connections_free_response(
virtual BluetoothConnectionsFreeResponse subscribe_bluetooth_connections_free(
const SubscribeBluetoothConnectionsFreeRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
@@ -337,37 +316,27 @@ class APIServerConnection : public APIServerConnectionBase {
virtual void subscribe_voice_assistant(const SubscribeVoiceAssistantRequest &msg) = 0;
#endif
#ifdef USE_VOICE_ASSISTANT
virtual bool send_voice_assistant_get_configuration_response(const VoiceAssistantConfigurationRequest &msg) = 0;
virtual VoiceAssistantConfigurationResponse voice_assistant_get_configuration(
const VoiceAssistantConfigurationRequest &msg) = 0;
#endif
#ifdef USE_VOICE_ASSISTANT
virtual void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) = 0;
#endif
#ifdef USE_ALARM_CONTROL_PANEL
virtual void alarm_control_panel_command(const AlarmControlPanelCommandRequest &msg) = 0;
#endif
#ifdef USE_ZWAVE_PROXY
virtual void zwave_proxy_frame(const ZWaveProxyFrame &msg) = 0;
#endif
#ifdef USE_ZWAVE_PROXY
virtual void zwave_proxy_request(const ZWaveProxyRequest &msg) = 0;
#endif
protected:
void on_hello_request(const HelloRequest &msg) override;
#ifdef USE_API_PASSWORD
void on_authentication_request(const AuthenticationRequest &msg) override;
#endif
void on_connect_request(const ConnectRequest &msg) override;
void on_disconnect_request(const DisconnectRequest &msg) override;
void on_ping_request(const PingRequest &msg) override;
void on_device_info_request(const DeviceInfoRequest &msg) override;
void on_list_entities_request(const ListEntitiesRequest &msg) override;
void on_subscribe_states_request(const SubscribeStatesRequest &msg) override;
void on_subscribe_logs_request(const SubscribeLogsRequest &msg) override;
#ifdef USE_API_HOMEASSISTANT_SERVICES
void on_subscribe_homeassistant_services_request(const SubscribeHomeassistantServicesRequest &msg) override;
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
void on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &msg) override;
#endif
void on_get_time_request(const GetTimeRequest &msg) override;
#ifdef USE_API_SERVICES
void on_execute_service_request(const ExecuteServiceRequest &msg) override;
#endif
@@ -474,13 +443,7 @@ class APIServerConnection : public APIServerConnectionBase {
#ifdef USE_ALARM_CONTROL_PANEL
void on_alarm_control_panel_command_request(const AlarmControlPanelCommandRequest &msg) override;
#endif
#ifdef USE_ZWAVE_PROXY
void on_z_wave_proxy_frame(const ZWaveProxyFrame &msg) override;
#endif
#ifdef USE_ZWAVE_PROXY
void on_z_wave_proxy_request(const ZWaveProxyRequest &msg) override;
#endif
void read_message(uint32_t msg_size, uint32_t msg_type, uint8_t *msg_data) override;
};
} // namespace esphome::api
} // namespace api
} // namespace esphome

View File

@@ -9,18 +9,15 @@
#include "esphome/core/log.h"
#include "esphome/core/util.h"
#include "esphome/core/version.h"
#ifdef USE_API_HOMEASSISTANT_SERVICES
#include "homeassistant_service.h"
#endif
#ifdef USE_LOGGER
#include "esphome/components/logger/logger.h"
#endif
#include <algorithm>
#include <utility>
namespace esphome::api {
namespace esphome {
namespace api {
static const char *const TAG = "api";
@@ -41,14 +38,12 @@ void APIServer::setup() {
this->noise_pref_ = global_preferences->make_preference<SavedNoisePsk>(hash, true);
#ifndef USE_API_NOISE_PSK_FROM_YAML
// Only load saved PSK if not set from YAML
SavedNoisePsk noise_pref_saved{};
if (this->noise_pref_.load(&noise_pref_saved)) {
ESP_LOGD(TAG, "Loaded saved Noise PSK");
this->set_noise_psk(noise_pref_saved.psk);
}
#endif
#endif
// Schedule reboot if no clients connect within timeout
@@ -91,7 +86,7 @@ void APIServer::setup() {
return;
}
err = this->socket_->listen(this->listen_backlog_);
err = this->socket_->listen(4);
if (err != 0) {
ESP_LOGW(TAG, "Socket unable to listen: errno %d", errno);
this->mark_failed();
@@ -144,19 +139,9 @@ void APIServer::loop() {
while (true) {
struct sockaddr_storage source_addr;
socklen_t addr_len = sizeof(source_addr);
auto sock = this->socket_->accept_loop_monitored((struct sockaddr *) &source_addr, &addr_len);
if (!sock)
break;
// Check if we're at the connection limit
if (this->clients_.size() >= this->max_connections_) {
ESP_LOGW(TAG, "Max connections (%d), rejecting %s", this->max_connections_, sock->getpeername().c_str());
// Immediately close - socket destructor will handle cleanup
sock.reset();
continue;
}
ESP_LOGD(TAG, "Accept %s", sock->getpeername().c_str());
auto *conn = new APIConnection(std::move(sock), this);
@@ -181,8 +166,7 @@ void APIServer::loop() {
// Network is down - disconnect all clients
for (auto &client : this->clients_) {
client->on_fatal_error();
ESP_LOGW(TAG, "%s (%s): Network down; disconnect", client->client_info_.name.c_str(),
client->client_info_.peername.c_str());
ESP_LOGW(TAG, "%s: Network down; disconnect", client->get_client_combined_info().c_str());
}
// Continue to process and clean up the clients below
}
@@ -200,9 +184,9 @@ void APIServer::loop() {
// Rare case: handle disconnection
#ifdef USE_API_CLIENT_DISCONNECTED_TRIGGER
this->client_disconnected_trigger_->trigger(client->client_info_.name, client->client_info_.peername);
this->client_disconnected_trigger_->trigger(client->client_info_, client->client_peername_);
#endif
ESP_LOGV(TAG, "Remove connection %s", client->client_info_.name.c_str());
ESP_LOGV(TAG, "Remove connection %s", client->client_info_.c_str());
// Swap with the last element and pop (avoids expensive vector shifts)
if (client_index < this->clients_.size() - 1) {
@@ -221,10 +205,8 @@ void APIServer::loop() {
void APIServer::dump_config() {
ESP_LOGCONFIG(TAG,
"Server:\n"
" Address: %s:%u\n"
" Listen backlog: %u\n"
" Max connections: %u",
network::get_use_address().c_str(), this->port_, this->listen_backlog_, this->max_connections_);
" Address: %s:%u",
network::get_use_address().c_str(), this->port_);
#ifdef USE_API_NOISE
ESP_LOGCONFIG(TAG, " Noise encryption: %s", YESNO(this->noise_ctx_->has_psk()));
if (!this->noise_ctx_->has_psk()) {
@@ -236,12 +218,12 @@ void APIServer::dump_config() {
}
#ifdef USE_API_PASSWORD
bool APIServer::check_password(const uint8_t *password_data, size_t password_len) const {
bool APIServer::check_password(const std::string &password) const {
// depend only on input password length
const char *a = this->password_.c_str();
uint32_t len_a = this->password_.length();
const char *b = reinterpret_cast<const char *>(password_data);
uint32_t len_b = password_len;
const char *b = password.c_str();
uint32_t len_b = password.length();
// disable optimization with volatile
volatile uint32_t length = len_b;
@@ -264,7 +246,6 @@ bool APIServer::check_password(const uint8_t *password_data, size_t password_len
return result == 0;
}
#endif
void APIServer::handle_disconnect(APIConnection *conn) {}
@@ -375,15 +356,6 @@ void APIServer::on_update(update::UpdateEntity *obj) {
}
#endif
#ifdef USE_ZWAVE_PROXY
void APIServer::on_zwave_proxy_request(const esphome::api::ProtoMessage &msg) {
// We could add code to manage a second subscription type, but, since this message type is
// very infrequent and small, we simply send it to all clients
for (auto &c : this->clients_)
c->send_message(msg, api::ZWaveProxyRequest::MESSAGE_TYPE);
}
#endif
#ifdef USE_ALARM_CONTROL_PANEL
API_DISPATCH_UPDATE(alarm_control_panel::AlarmControlPanel, alarm_control_panel)
#endif
@@ -398,46 +370,12 @@ void APIServer::set_password(const std::string &password) { this->password_ = pa
void APIServer::set_batch_delay(uint16_t batch_delay) { this->batch_delay_ = batch_delay; }
#ifdef USE_API_HOMEASSISTANT_SERVICES
void APIServer::send_homeassistant_action(const HomeassistantActionRequest &call) {
void APIServer::send_homeassistant_service_call(const HomeassistantServiceResponse &call) {
for (auto &client : this->clients_) {
client->send_homeassistant_action(call);
client->send_homeassistant_service_call(call);
}
}
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
void APIServer::register_action_response_callback(uint32_t call_id, ActionResponseCallback callback) {
this->action_response_callbacks_.push_back({call_id, std::move(callback)});
}
void APIServer::handle_action_response(uint32_t call_id, bool success, const std::string &error_message) {
for (auto it = this->action_response_callbacks_.begin(); it != this->action_response_callbacks_.end(); ++it) {
if (it->call_id == call_id) {
auto callback = std::move(it->callback);
this->action_response_callbacks_.erase(it);
ActionResponse response(success, error_message);
callback(response);
return;
}
}
}
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
void APIServer::handle_action_response(uint32_t call_id, bool success, const std::string &error_message,
const uint8_t *response_data, size_t response_data_len) {
for (auto it = this->action_response_callbacks_.begin(); it != this->action_response_callbacks_.end(); ++it) {
if (it->call_id == call_id) {
auto callback = std::move(it->callback);
this->action_response_callbacks_.erase(it);
ActionResponse response(success, error_message, response_data, response_data_len);
callback(response);
return;
}
}
}
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
#endif // USE_API_HOMEASSISTANT_SERVICES
#ifdef USE_API_HOMEASSISTANT_STATES
void APIServer::subscribe_home_assistant_state(std::string entity_id, optional<std::string> attribute,
std::function<void(std::string)> f) {
this->state_subs_.push_back(HomeAssistantStateSubscription{
@@ -461,7 +399,6 @@ void APIServer::get_home_assistant_state(std::string entity_id, optional<std::st
const std::vector<APIServer::HomeAssistantStateSubscription> &APIServer::get_state_subs() const {
return this->state_subs_;
}
#endif
uint16_t APIServer::get_port() const { return this->port_; }
@@ -469,12 +406,6 @@ void APIServer::set_reboot_timeout(uint32_t reboot_timeout) { this->reboot_timeo
#ifdef USE_API_NOISE
bool APIServer::save_noise_psk(psk_t psk, bool make_active) {
#ifdef USE_API_NOISE_PSK_FROM_YAML
// When PSK is set from YAML, this function should never be called
// but if it is, reject the change
ESP_LOGW(TAG, "Key set in YAML");
return false;
#else
auto &old_psk = this->noise_ctx_->get_psk();
if (std::equal(old_psk.begin(), old_psk.end(), psk.begin())) {
ESP_LOGW(TAG, "New PSK matches old");
@@ -503,7 +434,6 @@ bool APIServer::save_noise_psk(psk_t psk, bool make_active) {
});
}
return true;
#endif
}
#endif
@@ -553,5 +483,6 @@ bool APIServer::teardown() {
return this->clients_.empty();
}
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif


@@ -16,10 +16,10 @@
#include "user_services.h"
#endif
#include <map>
#include <vector>
namespace esphome::api {
namespace esphome {
namespace api {
#ifdef USE_API_NOISE
struct SavedNoisePsk {
@@ -38,15 +38,13 @@ class APIServer : public Component, public Controller {
void on_shutdown() override;
bool teardown() override;
#ifdef USE_API_PASSWORD
bool check_password(const uint8_t *password_data, size_t password_len) const;
bool check_password(const std::string &password) const;
void set_password(const std::string &password);
#endif
void set_port(uint16_t port);
void set_reboot_timeout(uint32_t reboot_timeout);
void set_batch_delay(uint16_t batch_delay);
uint16_t get_batch_delay() const { return batch_delay_; }
void set_listen_backlog(uint8_t listen_backlog) { this->listen_backlog_ = listen_backlog; }
void set_max_connections(uint8_t max_connections) { this->max_connections_ = max_connections; }
// Get reference to shared buffer for API connections
std::vector<uint8_t> &get_shared_buffer_ref() { return shared_write_buffer_; }
@@ -109,20 +107,7 @@ class APIServer : public Component, public Controller {
#ifdef USE_MEDIA_PLAYER
void on_media_player_update(media_player::MediaPlayer *obj) override;
#endif
#ifdef USE_API_HOMEASSISTANT_SERVICES
void send_homeassistant_action(const HomeassistantActionRequest &call);
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
// Action response handling
using ActionResponseCallback = std::function<void(const class ActionResponse &)>;
void register_action_response_callback(uint32_t call_id, ActionResponseCallback callback);
void handle_action_response(uint32_t call_id, bool success, const std::string &error_message);
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
void handle_action_response(uint32_t call_id, bool success, const std::string &error_message,
const uint8_t *response_data, size_t response_data_len);
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
#endif // USE_API_HOMEASSISTANT_SERVICES
void send_homeassistant_service_call(const HomeassistantServiceResponse &call);
#ifdef USE_API_SERVICES
void register_user_service(UserServiceDescriptor *descriptor) { this->user_services_.push_back(descriptor); }
#endif
@@ -139,13 +124,9 @@ class APIServer : public Component, public Controller {
#ifdef USE_UPDATE
void on_update(update::UpdateEntity *obj) override;
#endif
#ifdef USE_ZWAVE_PROXY
void on_zwave_proxy_request(const esphome::api::ProtoMessage &msg);
#endif
bool is_connected() const;
#ifdef USE_API_HOMEASSISTANT_STATES
struct HomeAssistantStateSubscription {
std::string entity_id;
optional<std::string> attribute;
@@ -158,7 +139,6 @@ class APIServer : public Component, public Controller {
void get_home_assistant_state(std::string entity_id, optional<std::string> attribute,
std::function<void(std::string)> f);
const std::vector<HomeAssistantStateSubscription> &get_state_subs() const;
#endif
#ifdef USE_API_SERVICES
const std::vector<UserServiceDescriptor *> &get_user_services() const { return this->user_services_; }
#endif
@@ -192,29 +172,16 @@ class APIServer : public Component, public Controller {
std::string password_;
#endif
std::vector<uint8_t> shared_write_buffer_; // Shared proto write buffer for all connections
#ifdef USE_API_HOMEASSISTANT_STATES
std::vector<HomeAssistantStateSubscription> state_subs_;
#endif
#ifdef USE_API_SERVICES
std::vector<UserServiceDescriptor *> user_services_;
#endif
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
struct PendingActionResponse {
uint32_t call_id;
ActionResponseCallback callback;
};
std::vector<PendingActionResponse> action_response_callbacks_;
#endif
// Group smaller types together
uint16_t port_{6053};
uint16_t batch_delay_{100};
// Connection limits - these defaults will be overridden by config values
// from cv.SplitDefault in __init__.py which sets platform-specific defaults
uint8_t listen_backlog_{4};
uint8_t max_connections_{8};
bool shutting_down_ = false;
// 7 bytes used, 1 byte padding
// 5 bytes used, 3 bytes padding
#ifdef USE_API_NOISE
std::shared_ptr<APINoiseContext> noise_ctx_ = std::make_shared<APINoiseContext>();
@@ -229,5 +196,6 @@ template<typename... Ts> class APIConnectedCondition : public Condition<Ts...> {
bool check(Ts... x) override { return global_api_server->is_connected(); }
};
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif


@@ -14,8 +14,6 @@ with warnings.catch_warnings():
from aioesphomeapi import APIClient, parse_log_message
from aioesphomeapi.log_runner import async_run
import contextlib
from esphome.const import CONF_KEY, CONF_PASSWORD, CONF_PORT, __version__
from esphome.core import CORE
@@ -30,7 +28,7 @@ if TYPE_CHECKING:
_LOGGER = logging.getLogger(__name__)
async def async_run_logs(config: dict[str, Any], addresses: list[str]) -> None:
async def async_run_logs(config: dict[str, Any], address: str) -> None:
"""Run the logs command in the event loop."""
conf = config["api"]
name = config["esphome"]["name"]
@@ -39,21 +37,13 @@ async def async_run_logs(config: dict[str, Any], addresses: list[str]) -> None:
noise_psk: str | None = None
if (encryption := conf.get(CONF_ENCRYPTION)) and (key := encryption.get(CONF_KEY)):
noise_psk = key
if len(addresses) == 1:
_LOGGER.info("Starting log output from %s using esphome API", addresses[0])
else:
_LOGGER.info(
"Starting log output from %s using esphome API", " or ".join(addresses)
)
_LOGGER.info("Starting log output from %s using esphome API", address)
cli = APIClient(
addresses[0], # Primary address for compatibility
address,
port,
password,
client_info=f"ESPHome Logs {__version__}",
noise_psk=noise_psk,
addresses=addresses, # Pass all addresses for automatic retry
)
dashboard = CORE.dashboard
@@ -62,11 +52,9 @@ async def async_run_logs(config: dict[str, Any], addresses: list[str]) -> None:
time_ = datetime.now()
message: bytes = msg.message
text = message.decode("utf8", "backslashreplace")
nanoseconds = time_.microsecond // 1000
timestamp = (
f"[{time_.hour:02}:{time_.minute:02}:{time_.second:02}.{nanoseconds:03}]"
)
for parsed_msg in parse_log_message(text, timestamp):
for parsed_msg in parse_log_message(
text, f"[{time_.hour:02}:{time_.minute:02}:{time_.second:02}]"
):
print(parsed_msg.replace("\033", "\\033") if dashboard else parsed_msg)
stop = await async_run(cli, on_log, name=name)
@@ -76,7 +64,9 @@ async def async_run_logs(config: dict[str, Any], addresses: list[str]) -> None:
await stop()
def run_logs(config: dict[str, Any], addresses: list[str]) -> None:
def run_logs(config: dict[str, Any], address: str) -> None:
"""Run the logs command."""
with contextlib.suppress(KeyboardInterrupt):
asyncio.run(async_run_logs(config, addresses))
try:
asyncio.run(async_run_logs(config, address))
except KeyboardInterrupt:
pass


@@ -6,7 +6,8 @@
#ifdef USE_API_SERVICES
#include "user_services.h"
#endif
namespace esphome::api {
namespace esphome {
namespace api {
#ifdef USE_API_SERVICES
template<typename T, typename... Ts> class CustomAPIDeviceService : public UserServiceBase<Ts...> {
@@ -56,14 +57,6 @@ class CustomAPIDevice {
auto *service = new CustomAPIDeviceService<T, Ts...>(name, arg_names, (T *) this, callback); // NOLINT
global_api_server->register_user_service(service);
}
#else
template<typename T, typename... Ts>
void register_service(void (T::*callback)(Ts...), const std::string &name,
const std::array<std::string, sizeof...(Ts)> &arg_names) {
static_assert(
sizeof(T) == 0,
"register_service() requires 'custom_services: true' in the 'api:' section of your YAML configuration");
}
#endif
/** Register a custom native API service that will show up in Home Assistant.
@@ -89,15 +82,8 @@ class CustomAPIDevice {
auto *service = new CustomAPIDeviceService<T>(name, {}, (T *) this, callback); // NOLINT
global_api_server->register_user_service(service);
}
#else
template<typename T> void register_service(void (T::*callback)(), const std::string &name) {
static_assert(
sizeof(T) == 0,
"register_service() requires 'custom_services: true' in the 'api:' section of your YAML configuration");
}
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
/** Subscribe to the state (or attribute state) of an entity from Home Assistant.
*
* Usage:
@@ -149,25 +135,7 @@ class CustomAPIDevice {
auto f = std::bind(callback, (T *) this, entity_id, std::placeholders::_1);
global_api_server->subscribe_home_assistant_state(entity_id, optional<std::string>(attribute), f);
}
#else
template<typename T>
void subscribe_homeassistant_state(void (T::*callback)(std::string), const std::string &entity_id,
const std::string &attribute = "") {
static_assert(sizeof(T) == 0,
"subscribe_homeassistant_state() requires 'homeassistant_states: true' in the 'api:' section "
"of your YAML configuration");
}
template<typename T>
void subscribe_homeassistant_state(void (T::*callback)(std::string, std::string), const std::string &entity_id,
const std::string &attribute = "") {
static_assert(sizeof(T) == 0,
"subscribe_homeassistant_state() requires 'homeassistant_states: true' in the 'api:' section "
"of your YAML configuration");
}
#endif
#ifdef USE_API_HOMEASSISTANT_SERVICES
/** Call a Home Assistant service from ESPHome.
*
* Usage:
@@ -179,9 +147,9 @@ class CustomAPIDevice {
* @param service_name The service to call.
*/
void call_homeassistant_service(const std::string &service_name) {
HomeassistantActionRequest resp;
resp.set_service(StringRef(service_name));
global_api_server->send_homeassistant_action(resp);
HomeassistantServiceResponse resp;
resp.service = service_name;
global_api_server->send_homeassistant_service_call(resp);
}
/** Call a Home Assistant service from ESPHome.
@@ -199,15 +167,15 @@ class CustomAPIDevice {
* @param data The data for the service call, mapping from string to string.
*/
void call_homeassistant_service(const std::string &service_name, const std::map<std::string, std::string> &data) {
HomeassistantActionRequest resp;
resp.set_service(StringRef(service_name));
HomeassistantServiceResponse resp;
resp.service = service_name;
for (auto &it : data) {
resp.data.emplace_back();
auto &kv = resp.data.back();
kv.set_key(StringRef(it.first));
HomeassistantServiceMap kv;
kv.key = it.first;
kv.value = it.second;
resp.data.push_back(kv);
}
global_api_server->send_homeassistant_action(resp);
global_api_server->send_homeassistant_service_call(resp);
}
/** Fire an ESPHome event in Home Assistant.
@@ -221,10 +189,10 @@ class CustomAPIDevice {
* @param event_name The event to fire.
*/
void fire_homeassistant_event(const std::string &event_name) {
HomeassistantActionRequest resp;
resp.set_service(StringRef(event_name));
HomeassistantServiceResponse resp;
resp.service = event_name;
resp.is_event = true;
global_api_server->send_homeassistant_action(resp);
global_api_server->send_homeassistant_service_call(resp);
}
/** Fire an ESPHome event in Home Assistant.
@@ -241,41 +209,19 @@ class CustomAPIDevice {
* @param data The data for the event, mapping from string to string.
*/
void fire_homeassistant_event(const std::string &service_name, const std::map<std::string, std::string> &data) {
HomeassistantActionRequest resp;
resp.set_service(StringRef(service_name));
HomeassistantServiceResponse resp;
resp.service = service_name;
resp.is_event = true;
for (auto &it : data) {
resp.data.emplace_back();
auto &kv = resp.data.back();
kv.set_key(StringRef(it.first));
HomeassistantServiceMap kv;
kv.key = it.first;
kv.value = it.second;
resp.data.push_back(kv);
}
global_api_server->send_homeassistant_action(resp);
global_api_server->send_homeassistant_service_call(resp);
}
#else
template<typename T = void> void call_homeassistant_service(const std::string &service_name) {
static_assert(sizeof(T) == 0, "call_homeassistant_service() requires 'homeassistant_services: true' in the 'api:' "
"section of your YAML configuration");
}
template<typename T = void>
void call_homeassistant_service(const std::string &service_name, const std::map<std::string, std::string> &data) {
static_assert(sizeof(T) == 0, "call_homeassistant_service() requires 'homeassistant_services: true' in the 'api:' "
"section of your YAML configuration");
}
template<typename T = void> void fire_homeassistant_event(const std::string &event_name) {
static_assert(sizeof(T) == 0, "fire_homeassistant_event() requires 'homeassistant_services: true' in the 'api:' "
"section of your YAML configuration");
}
template<typename T = void>
void fire_homeassistant_event(const std::string &service_name, const std::map<std::string, std::string> &data) {
static_assert(sizeof(T) == 0, "fire_homeassistant_event() requires 'homeassistant_services: true' in the 'api:' "
"section of your YAML configuration");
}
#endif
};
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif
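// Illustrative sketch (not part of the source tree): a custom component using the
// CustomAPIDevice helpers above. It assumes 'custom_services: true' and
// 'homeassistant_services: true' are enabled under 'api:' in YAML, as the
// static_assert messages in this file require; the class, service and entity
// names below are invented for the example.
class WasherController : public esphome::Component, public esphome::api::CustomAPIDevice {
 public:
  void setup() override {
    // Registers a user-defined service that shows up in Home Assistant
    register_service(&WasherController::on_start_wash, "start_wash");
  }

  void on_start_wash() {
    // Perform a Home Assistant action and fire an event from the device
    call_homeassistant_service("switch.turn_on", {{"entity_id", "switch.washer_power"}});
    fire_homeassistant_event("esphome.wash_started");
  }
};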


@@ -2,18 +2,13 @@
#include "api_server.h"
#ifdef USE_API
#ifdef USE_API_HOMEASSISTANT_SERVICES
#include <functional>
#include <utility>
#include <vector>
#include "api_pb2.h"
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
#include "esphome/components/json/json_util.h"
#endif
#include "esphome/core/automation.h"
#include "esphome/core/helpers.h"
#include <vector>
namespace esphome::api {
namespace esphome {
namespace api {
template<typename... X> class TemplatableStringValue : public TemplatableValue<std::string, X...> {
private:
@@ -41,178 +36,61 @@ template<typename... X> class TemplatableStringValue : public TemplatableValue<s
template<typename... Ts> class TemplatableKeyValuePair {
public:
// Keys are always string literals from YAML dictionary keys (e.g., "code", "event")
// and never templatable values or lambdas. Only the value parameter can be a lambda/template.
// Using pass-by-value with std::move allows optimal performance for both lvalues and rvalues.
template<typename T> TemplatableKeyValuePair(std::string key, T value) : key(std::move(key)), value(value) {}
std::string key;
TemplatableStringValue<Ts...> value;
};
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
// Represents the response data from a Home Assistant action
class ActionResponse {
public:
ActionResponse(bool success, std::string error_message = "")
: success_(success), error_message_(std::move(error_message)) {}
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
ActionResponse(bool success, std::string error_message, const uint8_t *data, size_t data_len)
: success_(success), error_message_(std::move(error_message)) {
if (data == nullptr || data_len == 0)
return;
this->json_document_ = json::parse_json(data, data_len);
}
#endif
bool is_success() const { return this->success_; }
const std::string &get_error_message() const { return this->error_message_; }
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
// Get data as parsed JSON object (const version returns read-only view)
JsonObjectConst get_json() const { return this->json_document_.as<JsonObjectConst>(); }
#endif
protected:
bool success_;
std::string error_message_;
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
JsonDocument json_document_;
#endif
};
// Callback type for action responses
template<typename... Ts> using ActionResponseCallback = std::function<void(const ActionResponse &, Ts...)>;
#endif
template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts...> {
public:
explicit HomeAssistantServiceCallAction(APIServer *parent, bool is_event) : parent_(parent) {
this->flags_.is_event = is_event;
}
explicit HomeAssistantServiceCallAction(APIServer *parent, bool is_event) : parent_(parent), is_event_(is_event) {}
template<typename T> void set_service(T service) { this->service_ = service; }
// Keys are always string literals from the Python code generation (e.g., cg.add(var.add_data("tag_id", templ))).
// The value parameter can be a lambda/template, but keys are never templatable.
// Using pass-by-value allows the compiler to optimize for both lvalues and rvalues.
template<typename T> void add_data(std::string key, T value) { this->data_.emplace_back(std::move(key), value); }
template<typename T> void add_data(std::string key, T value) {
this->data_.push_back(TemplatableKeyValuePair<Ts...>(key, value));
}
template<typename T> void add_data_template(std::string key, T value) {
this->data_template_.emplace_back(std::move(key), value);
this->data_template_.push_back(TemplatableKeyValuePair<Ts...>(key, value));
}
template<typename T> void add_variable(std::string key, T value) {
this->variables_.emplace_back(std::move(key), value);
this->variables_.push_back(TemplatableKeyValuePair<Ts...>(key, value));
}
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
template<typename T> void set_response_template(T response_template) {
this->response_template_ = response_template;
this->flags_.has_response_template = true;
}
void set_wants_status() { this->flags_.wants_status = true; }
void set_wants_response() { this->flags_.wants_response = true; }
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
Trigger<JsonObjectConst, Ts...> *get_success_trigger_with_response() const {
return this->success_trigger_with_response_;
}
#endif
Trigger<Ts...> *get_success_trigger() const { return this->success_trigger_; }
Trigger<std::string, Ts...> *get_error_trigger() const { return this->error_trigger_; }
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
void play(Ts... x) override {
HomeassistantActionRequest resp;
std::string service_value = this->service_.value(x...);
resp.set_service(StringRef(service_value));
resp.is_event = this->flags_.is_event;
HomeassistantServiceResponse resp;
resp.service = this->service_.value(x...);
resp.is_event = this->is_event_;
for (auto &it : this->data_) {
resp.data.emplace_back();
auto &kv = resp.data.back();
kv.set_key(StringRef(it.key));
HomeassistantServiceMap kv;
kv.key = it.key;
kv.value = it.value.value(x...);
resp.data.push_back(kv);
}
for (auto &it : this->data_template_) {
resp.data_template.emplace_back();
auto &kv = resp.data_template.back();
kv.set_key(StringRef(it.key));
HomeassistantServiceMap kv;
kv.key = it.key;
kv.value = it.value.value(x...);
resp.data_template.push_back(kv);
}
for (auto &it : this->variables_) {
resp.variables.emplace_back();
auto &kv = resp.variables.back();
kv.set_key(StringRef(it.key));
HomeassistantServiceMap kv;
kv.key = it.key;
kv.value = it.value.value(x...);
resp.variables.push_back(kv);
}
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
if (this->flags_.wants_status) {
// Generate a unique call ID for this service call
static uint32_t call_id_counter = 1;
uint32_t call_id = call_id_counter++;
resp.call_id = call_id;
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
if (this->flags_.wants_response) {
resp.wants_response = true;
// Set response template if provided
if (this->flags_.has_response_template) {
std::string response_template_value = this->response_template_.value(x...);
resp.response_template = response_template_value;
}
}
#endif
auto captured_args = std::make_tuple(x...);
this->parent_->register_action_response_callback(call_id, [this, captured_args](const ActionResponse &response) {
std::apply(
[this, &response](auto &&...args) {
if (response.is_success()) {
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
if (this->flags_.wants_response) {
this->success_trigger_with_response_->trigger(response.get_json(), args...);
} else
#endif
{
this->success_trigger_->trigger(args...);
}
} else {
this->error_trigger_->trigger(response.get_error_message(), args...);
}
},
captured_args);
});
}
#endif
this->parent_->send_homeassistant_action(resp);
this->parent_->send_homeassistant_service_call(resp);
}
protected:
APIServer *parent_;
bool is_event_;
TemplatableStringValue<Ts...> service_{};
std::vector<TemplatableKeyValuePair<Ts...>> data_;
std::vector<TemplatableKeyValuePair<Ts...>> data_template_;
std::vector<TemplatableKeyValuePair<Ts...>> variables_;
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
TemplatableStringValue<Ts...> response_template_{""};
Trigger<JsonObjectConst, Ts...> *success_trigger_with_response_ = new Trigger<JsonObjectConst, Ts...>();
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
Trigger<Ts...> *success_trigger_ = new Trigger<Ts...>();
Trigger<std::string, Ts...> *error_trigger_ = new Trigger<std::string, Ts...>();
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
struct Flags {
uint8_t is_event : 1;
uint8_t wants_status : 1;
uint8_t wants_response : 1;
uint8_t has_response_template : 1;
uint8_t reserved : 5;
} flags_{0};
};
} // namespace esphome::api
#endif
} // namespace api
} // namespace esphome
#endif


@@ -6,7 +6,8 @@
#include "esphome/core/log.h"
#include "esphome/core/util.h"
namespace esphome::api {
namespace esphome {
namespace api {
// Generate entity handler implementations using macros
#ifdef USE_BINARY_SENSOR
@@ -89,5 +90,6 @@ bool ListEntitiesIterator::on_service(UserServiceDescriptor *service) {
}
#endif
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif


@@ -4,7 +4,8 @@
#ifdef USE_API
#include "esphome/core/component.h"
#include "esphome/core/component_iterator.h"
namespace esphome::api {
namespace esphome {
namespace api {
class APIConnection;
@@ -95,5 +96,6 @@ class ListEntitiesIterator : public ComponentIterator {
APIConnection *client_;
};
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif


@@ -3,75 +3,80 @@
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
namespace esphome::api {
namespace esphome {
namespace api {
static const char *const TAG = "api.proto";
void ProtoDecodableMessage::decode(const uint8_t *buffer, size_t length) {
const uint8_t *ptr = buffer;
const uint8_t *end = buffer + length;
while (ptr < end) {
uint32_t i = 0;
bool error = false;
while (i < length) {
uint32_t consumed;
// Parse field header
auto res = ProtoVarInt::parse(ptr, end - ptr, &consumed);
auto res = ProtoVarInt::parse(&buffer[i], length - i, &consumed);
if (!res.has_value()) {
ESP_LOGV(TAG, "Invalid field start at offset %ld", (long) (ptr - buffer));
return;
ESP_LOGV(TAG, "Invalid field start at %" PRIu32, i);
break;
}
uint32_t tag = res->as_uint32();
uint32_t field_type = tag & 0b111;
uint32_t field_id = tag >> 3;
ptr += consumed;
uint32_t field_type = (res->as_uint32()) & 0b111;
uint32_t field_id = (res->as_uint32()) >> 3;
i += consumed;
switch (field_type) {
case 0: { // VarInt
res = ProtoVarInt::parse(ptr, end - ptr, &consumed);
res = ProtoVarInt::parse(&buffer[i], length - i, &consumed);
if (!res.has_value()) {
ESP_LOGV(TAG, "Invalid VarInt at offset %ld", (long) (ptr - buffer));
return;
ESP_LOGV(TAG, "Invalid VarInt at %" PRIu32, i);
error = true;
break;
}
if (!this->decode_varint(field_id, *res)) {
ESP_LOGV(TAG, "Cannot decode VarInt field %" PRIu32 " with value %" PRIu32 "!", field_id, res->as_uint32());
}
ptr += consumed;
i += consumed;
break;
}
case 2: { // Length-delimited
res = ProtoVarInt::parse(ptr, end - ptr, &consumed);
res = ProtoVarInt::parse(&buffer[i], length - i, &consumed);
if (!res.has_value()) {
ESP_LOGV(TAG, "Invalid Length Delimited at offset %ld", (long) (ptr - buffer));
return;
ESP_LOGV(TAG, "Invalid Length Delimited at %" PRIu32, i);
error = true;
break;
}
uint32_t field_length = res->as_uint32();
ptr += consumed;
if (ptr + field_length > end) {
ESP_LOGV(TAG, "Out-of-bounds Length Delimited at offset %ld", (long) (ptr - buffer));
return;
i += consumed;
if (field_length > length - i) {
ESP_LOGV(TAG, "Out-of-bounds Length Delimited at %" PRIu32, i);
error = true;
break;
}
if (!this->decode_length(field_id, ProtoLengthDelimited(ptr, field_length))) {
if (!this->decode_length(field_id, ProtoLengthDelimited(&buffer[i], field_length))) {
ESP_LOGV(TAG, "Cannot decode Length Delimited field %" PRIu32 "!", field_id);
}
ptr += field_length;
i += field_length;
break;
}
case 5: { // 32-bit
if (ptr + 4 > end) {
ESP_LOGV(TAG, "Out-of-bounds Fixed32-bit at offset %ld", (long) (ptr - buffer));
return;
if (length - i < 4) {
ESP_LOGV(TAG, "Out-of-bounds Fixed32-bit at %" PRIu32, i);
error = true;
break;
}
uint32_t val = encode_uint32(ptr[3], ptr[2], ptr[1], ptr[0]);
uint32_t val = encode_uint32(buffer[i + 3], buffer[i + 2], buffer[i + 1], buffer[i]);
if (!this->decode_32bit(field_id, Proto32Bit(val))) {
ESP_LOGV(TAG, "Cannot decode 32-bit field %" PRIu32 " with value %" PRIu32 "!", field_id, val);
}
ptr += 4;
i += 4;
break;
}
default:
ESP_LOGV(TAG, "Invalid field type %u at offset %ld", field_type, (long) (ptr - buffer));
return;
ESP_LOGV(TAG, "Invalid field type at %" PRIu32, i);
error = true;
break;
}
if (error) {
break;
}
}
}
@@ -84,4 +89,5 @@ std::string ProtoMessage::dump() const {
}
#endif
} // namespace esphome::api
} // namespace api
} // namespace esphome
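// Illustrative sketch (not part of the source tree): a minimal decodable message
// that the decode() loop above could fill in. The override signatures are inferred
// from how decode() dispatches fields (decode_varint / decode_length returning
// bool); the DemoMessage type and its field numbers are invented for the example.
class DemoMessage : public ProtoDecodableMessage {
 public:
  uint32_t count{0};   // field 1, wire type 0 (varint)
  std::string label;   // field 2, wire type 2 (length-delimited)

 protected:
  bool decode_varint(uint32_t field_id, ProtoVarInt value) override {
    if (field_id == 1) {
      this->count = value.as_uint32();
      return true;
    }
    return false;  // unknown fields are logged at VERY_VERBOSE and skipped by the loop
  }
  bool decode_length(uint32_t field_id, ProtoLengthDelimited value) override {
    if (field_id == 2) {
      this->label = value.as_string();
      return true;
    }
    return false;
  }
#ifdef HAS_PROTO_MESSAGE_DUMP
  void dump_to(std::string &out) const override { out += "DemoMessage"; }
#endif
};
// Usage sketch: DemoMessage msg; msg.decode(frame, frame_len);  // frame/frame_len: a received payload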


@@ -3,64 +3,16 @@
#include "esphome/core/component.h"
#include "esphome/core/helpers.h"
#include "esphome/core/log.h"
#include "esphome/core/string_ref.h"
#include <cassert>
#include <cstring>
#include <vector>
#ifdef ESPHOME_LOG_HAS_VERY_VERBOSE
#define HAS_PROTO_MESSAGE_DUMP
#endif
namespace esphome::api {
// Helper functions for ZigZag encoding/decoding
inline constexpr uint32_t encode_zigzag32(int32_t value) {
return (static_cast<uint32_t>(value) << 1) ^ (static_cast<uint32_t>(value >> 31));
}
inline constexpr uint64_t encode_zigzag64(int64_t value) {
return (static_cast<uint64_t>(value) << 1) ^ (static_cast<uint64_t>(value >> 63));
}
inline constexpr int32_t decode_zigzag32(uint32_t value) {
return (value & 1) ? static_cast<int32_t>(~(value >> 1)) : static_cast<int32_t>(value >> 1);
}
inline constexpr int64_t decode_zigzag64(uint64_t value) {
return (value & 1) ? static_cast<int64_t>(~(value >> 1)) : static_cast<int64_t>(value >> 1);
}
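// Illustrative example (not from the source tree): the constexpr ZigZag helpers
// above interleave signed values so small magnitudes map to small unsigned values,
// and therefore to short varints on the wire.
static_assert(encode_zigzag32(0) == 0u, "0 -> 0");
static_assert(encode_zigzag32(-1) == 1u, "-1 -> 1");
static_assert(encode_zigzag32(1) == 2u, "1 -> 2");
static_assert(encode_zigzag32(-2) == 3u, "-2 -> 3");
static_assert(decode_zigzag32(3u) == -2, "decode inverts the mapping");
static_assert(decode_zigzag64(encode_zigzag64(-300)) == -300, "64-bit round trip");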
/*
* StringRef Ownership Model for API Protocol Messages
* ===================================================
*
* StringRef is used for zero-copy string handling in outgoing (SOURCE_SERVER) messages.
* It holds a pointer and length to existing string data without copying.
*
* CRITICAL: The referenced string data MUST remain valid until message encoding completes.
*
* Safe StringRef Patterns:
* 1. String literals: StringRef("literal") - Always safe (static storage duration)
* 2. Member variables: StringRef(this->member_string_) - Safe if object outlives encoding
* 3. Global/static strings: StringRef(GLOBAL_CONSTANT) - Always safe
* 4. Local variables: Safe ONLY if encoding happens before function returns:
* std::string temp = compute_value();
* msg.set_field(StringRef(temp));
* return this->send_message(msg); // temp is valid during encoding
*
* Unsafe Patterns (WILL cause crashes/corruption):
* 1. Temporaries: msg.set_field(StringRef(obj.get_string())) // get_string() returns by value
* 2. Concatenation: msg.set_field(StringRef(str1 + str2)) // Result is temporary
*
* For unsafe patterns, store in a local variable first:
* std::string temp = get_string(); // or str1 + str2
* msg.set_field(StringRef(temp));
*
* The send_*_response pattern ensures proper lifetime management by encoding
* within the same function scope where temporaries are created.
*/
namespace esphome {
namespace api {
/// Representation of a VarInt - in ProtoBuf should be 64bit but we only use 32bit
class ProtoVarInt {
@@ -104,25 +56,33 @@ class ProtoVarInt {
return {}; // Incomplete or invalid varint
}
constexpr uint16_t as_uint16() const { return this->value_; }
constexpr uint32_t as_uint32() const { return this->value_; }
constexpr uint64_t as_uint64() const { return this->value_; }
constexpr bool as_bool() const { return this->value_; }
constexpr int32_t as_int32() const {
uint16_t as_uint16() const { return this->value_; }
uint32_t as_uint32() const { return this->value_; }
uint64_t as_uint64() const { return this->value_; }
bool as_bool() const { return this->value_; }
int32_t as_int32() const {
// Not ZigZag encoded
return static_cast<int32_t>(this->as_int64());
}
constexpr int64_t as_int64() const {
int64_t as_int64() const {
// Not ZigZag encoded
return static_cast<int64_t>(this->value_);
}
constexpr int32_t as_sint32() const {
int32_t as_sint32() const {
// with ZigZag encoding
return decode_zigzag32(static_cast<uint32_t>(this->value_));
if (this->value_ & 1) {
return static_cast<int32_t>(~(this->value_ >> 1));
} else {
return static_cast<int32_t>(this->value_ >> 1);
}
}
constexpr int64_t as_sint64() const {
int64_t as_sint64() const {
// with ZigZag encoding
return decode_zigzag64(this->value_);
if (this->value_ & 1) {
return static_cast<int64_t>(~(this->value_ >> 1));
} else {
return static_cast<int64_t>(this->value_ >> 1);
}
}
/**
* Encode the varint value to a pre-allocated buffer without bounds checking.
@@ -182,10 +142,6 @@ class ProtoLengthDelimited {
explicit ProtoLengthDelimited(const uint8_t *value, size_t length) : value_(value), length_(length) {}
std::string as_string() const { return std::string(reinterpret_cast<const char *>(this->value_), this->length_); }
// Direct access to raw data without string allocation
const uint8_t *data() const { return this->value_; }
size_t size() const { return this->length_; }
/**
* Decode the length-delimited data into an existing ProtoDecodableMessage instance.
*
@@ -250,20 +206,12 @@ class ProtoWriteBuffer {
this->encode_field_raw(field_id, 2); // type 2: Length-delimited string
this->encode_varint_raw(len);
// Using resize + memcpy instead of insert provides significant performance improvement:
// ~10-11x faster for 16-32 byte strings, ~3x faster for 64-byte strings
// as it avoids iterator checks and potential element moves that insert performs
size_t old_size = this->buffer_->size();
this->buffer_->resize(old_size + len);
std::memcpy(this->buffer_->data() + old_size, string, len);
auto *data = reinterpret_cast<const uint8_t *>(string);
this->buffer_->insert(this->buffer_->end(), data, data + len);
}
void encode_string(uint32_t field_id, const std::string &value, bool force = false) {
this->encode_string(field_id, value.data(), value.size(), force);
}
void encode_string(uint32_t field_id, const StringRef &ref, bool force = false) {
this->encode_string(field_id, ref.c_str(), ref.size(), force);
}
void encode_bytes(uint32_t field_id, const uint8_t *data, size_t len, bool force = false) {
this->encode_string(field_id, reinterpret_cast<const char *>(data), len, force);
}
@@ -322,10 +270,22 @@ class ProtoWriteBuffer {
this->encode_uint64(field_id, static_cast<uint64_t>(value), force);
}
void encode_sint32(uint32_t field_id, int32_t value, bool force = false) {
this->encode_uint32(field_id, encode_zigzag32(value), force);
uint32_t uvalue;
if (value < 0) {
uvalue = ~(value << 1);
} else {
uvalue = value << 1;
}
this->encode_uint32(field_id, uvalue, force);
}
void encode_sint64(uint32_t field_id, int64_t value, bool force = false) {
this->encode_uint64(field_id, encode_zigzag64(value), force);
uint64_t uvalue;
if (value < 0) {
uvalue = ~(value << 1);
} else {
uvalue = value << 1;
}
this->encode_uint64(field_id, uvalue, force);
}
void encode_message(uint32_t field_id, const ProtoMessage &value, bool force = false);
std::vector<uint8_t> *get_buffer() const { return buffer_; }
@@ -334,16 +294,13 @@ class ProtoWriteBuffer {
std::vector<uint8_t> *buffer_;
};
// Forward declaration
class ProtoSize;
class ProtoMessage {
public:
virtual ~ProtoMessage() = default;
// Default implementation for messages with no fields
virtual void encode(ProtoWriteBuffer buffer) const {}
// Default implementation for messages with no fields
virtual void calculate_size(ProtoSize &size) const {}
virtual void calculate_size(uint32_t &total_size) const {}
#ifdef HAS_PROTO_MESSAGE_DUMP
std::string dump() const;
virtual void dump_to(std::string &out) const = 0;
@@ -364,39 +321,31 @@ class ProtoDecodableMessage : public ProtoMessage {
};
class ProtoSize {
private:
uint32_t total_size_ = 0;
public:
/**
* @brief ProtoSize class for Protocol Buffer serialization size calculation
*
* This class provides methods to calculate the exact byte counts needed
* for encoding various Protocol Buffer field types. The class now uses an
* object-based approach to reduce parameter passing overhead while keeping
* varint calculation methods static for external use.
* This class provides static methods to calculate the exact byte counts needed
* for encoding various Protocol Buffer field types. All methods are designed to be
* efficient for the common case where many fields have default values.
*
* Implements Protocol Buffer encoding size calculation according to:
* https://protobuf.dev/programming-guides/encoding/
*
* Key features:
* - Object-based approach reduces flash usage by eliminating parameter passing
* - Early-return optimization for zero/default values
* - Static varint methods for external callers
* - Direct total_size updates to avoid unnecessary additions
* - Specialized handling for different field types according to protobuf spec
* - Templated helpers for repeated fields and messages
*/
ProtoSize() = default;
uint32_t get_size() const { return total_size_; }
/**
* @brief Calculates the size in bytes needed to encode a uint32_t value as a varint
*
* @param value The uint32_t value to calculate size for
* @return The number of bytes needed to encode the value
*/
static constexpr uint32_t varint(uint32_t value) {
static inline uint32_t varint(uint32_t value) {
// Optimized varint size calculation using leading zeros
// Each 7 bits requires one byte in the varint encoding
if (value < 128)
@@ -420,7 +369,7 @@ class ProtoSize {
* @param value The uint64_t value to calculate size for
* @return The number of bytes needed to encode the value
*/
static constexpr uint32_t varint(uint64_t value) {
static inline uint32_t varint(uint64_t value) {
// Handle common case of values fitting in uint32_t (vast majority of use cases)
if (value <= UINT32_MAX) {
return varint(static_cast<uint32_t>(value));
@@ -451,7 +400,7 @@ class ProtoSize {
* @param value The int32_t value to calculate size for
* @return The number of bytes needed to encode the value
*/
static constexpr uint32_t varint(int32_t value) {
static inline uint32_t varint(int32_t value) {
// Negative values are sign-extended to 64 bits in protocol buffers,
// which always results in a 10-byte varint for negative int32
if (value < 0) {
@@ -467,7 +416,7 @@ class ProtoSize {
* @param value The int64_t value to calculate size for
* @return The number of bytes needed to encode the value
*/
static constexpr uint32_t varint(int64_t value) {
static inline uint32_t varint(int64_t value) {
// For int64_t, we convert to uint64_t and calculate the size
// This works because the bit pattern determines the encoding size,
// and we've handled negative int32 values as a special case above
@@ -481,7 +430,7 @@ class ProtoSize {
* @param type The wire type value (from the WireType enum in the protobuf spec)
* @return The number of bytes needed to encode the field ID and wire type
*/
static constexpr uint32_t field(uint32_t field_id, uint32_t type) {
static inline uint32_t field(uint32_t field_id, uint32_t type) {
uint32_t tag = (field_id << 3) | (type & 0b111);
return varint(tag);
}
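// Illustrative sketch (values made up, not from the source tree): what the size
// helpers above return. Every 7 bits of payload costs one varint byte, and a
// negative int32 is sign-extended to 64 bits, so it always costs 10 bytes.
// assert() comes from <cassert>.
inline void proto_size_examples() {
  assert(ProtoSize::field(1, 0) == 1);   // tag (1 << 3) | 0 = 0x08 fits in one byte
  assert(ProtoSize::varint(127u) == 1);  // 7 bits -> 1 byte
  assert(ProtoSize::varint(128u) == 2);  // 8 bits -> 2 bytes
  assert(ProtoSize::varint(300u) == 2);
  assert(ProtoSize::varint(70000u) == 3);
  assert(ProtoSize::varint(static_cast<int32_t>(-1)) == 10);  // negative int32 overload
}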
@@ -490,7 +439,9 @@ class ProtoSize {
* @brief Common parameters for all add_*_field methods
*
* All add_*_field methods follow these common patterns:
* @param field_id_size Pre-calculated size of the field ID in bytes
*
* @param total_size Reference to the total message size to update
* @param field_id_size Pre-calculated size of the field ID in bytes
* @param value The value to calculate size for (type varies)
* @param force Whether to calculate size even if the value is default/zero/empty
*
@@ -503,63 +454,104 @@ class ProtoSize {
/**
* @brief Calculates and adds the size of an int32 field to the total message size
*/
inline void add_int32(uint32_t field_id_size, int32_t value) {
if (value != 0) {
add_int32_force(field_id_size, value);
static inline void add_int32_field(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
// Skip calculation if value is zero
if (value == 0) {
return; // No need to update total_size
}
// Calculate and directly add to total_size
if (value < 0) {
// Negative values are encoded as 10-byte varints in protobuf
total_size += field_id_size + 10;
} else {
// For non-negative values, use the standard varint size
total_size += field_id_size + varint(static_cast<uint32_t>(value));
}
}
/**
* @brief Calculates and adds the size of an int32 field to the total message size (force version)
* @brief Calculates and adds the size of an int32 field to the total message size (repeated field version)
*/
inline void add_int32_force(uint32_t field_id_size, int32_t value) {
// Always calculate size when forced
// Negative values are encoded as 10-byte varints in protobuf
total_size_ += field_id_size + (value < 0 ? 10 : varint(static_cast<uint32_t>(value)));
static inline void add_int32_field_repeated(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
// Always calculate size for repeated fields
if (value < 0) {
// Negative values are encoded as 10-byte varints in protobuf
total_size += field_id_size + 10;
} else {
// For non-negative values, use the standard varint size
total_size += field_id_size + varint(static_cast<uint32_t>(value));
}
}
/**
* @brief Calculates and adds the size of a uint32 field to the total message size
*/
inline void add_uint32(uint32_t field_id_size, uint32_t value) {
if (value != 0) {
add_uint32_force(field_id_size, value);
static inline void add_uint32_field(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
// Skip calculation if value is zero
if (value == 0) {
return; // No need to update total_size
}
// Calculate and directly add to total_size
total_size += field_id_size + varint(value);
}
/**
* @brief Calculates and adds the size of a uint32 field to the total message size (force version)
* @brief Calculates and adds the size of a uint32 field to the total message size (repeated field version)
*/
inline void add_uint32_force(uint32_t field_id_size, uint32_t value) {
// Always calculate size when force is true
total_size_ += field_id_size + varint(value);
static inline void add_uint32_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
// Always calculate size for repeated fields
total_size += field_id_size + varint(value);
}
/**
* @brief Calculates and adds the size of a boolean field to the total message size
*/
inline void add_bool(uint32_t field_id_size, bool value) {
if (value) {
// Boolean fields always use 1 byte when true
total_size_ += field_id_size + 1;
static inline void add_bool_field(uint32_t &total_size, uint32_t field_id_size, bool value) {
// Skip calculation if value is false
if (!value) {
return; // No need to update total_size
}
// Boolean fields always use 1 byte when true
total_size += field_id_size + 1;
}
/**
* @brief Calculates and adds the size of a boolean field to the total message size (force version)
* @brief Calculates and adds the size of a boolean field to the total message size (repeated field version)
*/
inline void add_bool_force(uint32_t field_id_size, bool value) {
// Always calculate size when force is true
static inline void add_bool_field_repeated(uint32_t &total_size, uint32_t field_id_size, bool value) {
// Always calculate size for repeated fields
// Boolean fields always use 1 byte
total_size_ += field_id_size + 1;
total_size += field_id_size + 1;
}
/**
* @brief Calculates and adds the size of a fixed field to the total message size
*
* Fixed fields always take exactly N bytes (4 for fixed32/float, 8 for fixed64/double).
*
* @tparam NumBytes The number of bytes for this fixed field (4 or 8)
* @param is_nonzero Whether the value is non-zero
*/
template<uint32_t NumBytes>
static inline void add_fixed_field(uint32_t &total_size, uint32_t field_id_size, bool is_nonzero) {
// Skip calculation if value is zero
if (!is_nonzero) {
return; // No need to update total_size
}
// Fixed fields always take exactly NumBytes
total_size += field_id_size + NumBytes;
}
/**
* @brief Calculates and adds the size of a float field to the total message size
*/
inline void add_float(uint32_t field_id_size, float value) {
static inline void add_float_field(uint32_t &total_size, uint32_t field_id_size, float value) {
if (value != 0.0f) {
total_size_ += field_id_size + 4;
total_size += field_id_size + 4;
}
}
@@ -569,9 +561,9 @@ class ProtoSize {
/**
* @brief Calculates and adds the size of a fixed32 field to the total message size
*/
inline void add_fixed32(uint32_t field_id_size, uint32_t value) {
static inline void add_fixed32_field(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
if (value != 0) {
total_size_ += field_id_size + 4;
total_size += field_id_size + 4;
}
}
@@ -581,103 +573,137 @@ class ProtoSize {
/**
* @brief Calculates and adds the size of a sfixed32 field to the total message size
*/
inline void add_sfixed32(uint32_t field_id_size, int32_t value) {
static inline void add_sfixed32_field(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
if (value != 0) {
total_size_ += field_id_size + 4;
total_size += field_id_size + 4;
}
}
// NOTE: add_sfixed64_field removed - wire type 1 (64-bit: sfixed64) not supported
// to reduce overhead on embedded systems
/**
* @brief Calculates and adds the size of an enum field to the total message size
*
* Enum fields are encoded as uint32 varints.
*/
static inline void add_enum_field(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
// Skip calculation if value is zero
if (value == 0) {
return; // No need to update total_size
}
// Enums are encoded as uint32
total_size += field_id_size + varint(value);
}
/**
* @brief Calculates and adds the size of an enum field to the total message size (repeated field version)
*
* Enum fields are encoded as uint32 varints.
*/
static inline void add_enum_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint32_t value) {
// Always calculate size for repeated fields
// Enums are encoded as uint32
total_size += field_id_size + varint(value);
}
/**
* @brief Calculates and adds the size of a sint32 field to the total message size
*
* Sint32 fields use ZigZag encoding, which is more efficient for negative values.
*/
inline void add_sint32(uint32_t field_id_size, int32_t value) {
if (value != 0) {
add_sint32_force(field_id_size, value);
static inline void add_sint32_field(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
// Skip calculation if value is zero
if (value == 0) {
return; // No need to update total_size
}
// ZigZag encoding for sint32: (n << 1) ^ (n >> 31)
uint32_t zigzag = (static_cast<uint32_t>(value) << 1) ^ (static_cast<uint32_t>(value >> 31));
total_size += field_id_size + varint(zigzag);
}
/**
* @brief Calculates and adds the size of a sint32 field to the total message size (force version)
* @brief Calculates and adds the size of a sint32 field to the total message size (repeated field version)
*
* Sint32 fields use ZigZag encoding, which is more efficient for negative values.
*/
inline void add_sint32_force(uint32_t field_id_size, int32_t value) {
// Always calculate size when force is true
// ZigZag encoding for sint32
total_size_ += field_id_size + varint(encode_zigzag32(value));
static inline void add_sint32_field_repeated(uint32_t &total_size, uint32_t field_id_size, int32_t value) {
// Always calculate size for repeated fields
// ZigZag encoding for sint32: (n << 1) ^ (n >> 31)
uint32_t zigzag = (static_cast<uint32_t>(value) << 1) ^ (static_cast<uint32_t>(value >> 31));
total_size += field_id_size + varint(zigzag);
}
/**
* @brief Calculates and adds the size of an int64 field to the total message size
*/
inline void add_int64(uint32_t field_id_size, int64_t value) {
if (value != 0) {
add_int64_force(field_id_size, value);
static inline void add_int64_field(uint32_t &total_size, uint32_t field_id_size, int64_t value) {
// Skip calculation if value is zero
if (value == 0) {
return; // No need to update total_size
}
// Calculate and directly add to total_size
total_size += field_id_size + varint(value);
}
/**
* @brief Calculates and adds the size of an int64 field to the total message size (force version)
* @brief Calculates and adds the size of an int64 field to the total message size (repeated field version)
*/
inline void add_int64_force(uint32_t field_id_size, int64_t value) {
// Always calculate size when force is true
total_size_ += field_id_size + varint(value);
static inline void add_int64_field_repeated(uint32_t &total_size, uint32_t field_id_size, int64_t value) {
// Always calculate size for repeated fields
total_size += field_id_size + varint(value);
}
/**
* @brief Calculates and adds the size of a uint64 field to the total message size
*/
inline void add_uint64(uint32_t field_id_size, uint64_t value) {
if (value != 0) {
add_uint64_force(field_id_size, value);
static inline void add_uint64_field(uint32_t &total_size, uint32_t field_id_size, uint64_t value) {
// Skip calculation if value is zero
if (value == 0) {
return; // No need to update total_size
}
// Calculate and directly add to total_size
total_size += field_id_size + varint(value);
}
/**
* @brief Calculates and adds the size of a uint64 field to the total message size (force version)
* @brief Calculates and adds the size of a uint64 field to the total message size (repeated field version)
*/
inline void add_uint64_force(uint32_t field_id_size, uint64_t value) {
// Always calculate size when force is true
total_size_ += field_id_size + varint(value);
static inline void add_uint64_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint64_t value) {
// Always calculate size for repeated fields
total_size += field_id_size + varint(value);
}
// NOTE: sint64 support functions (add_sint64_field, add_sint64_field_force) removed
// NOTE: sint64 support functions (add_sint64_field, add_sint64_field_repeated) removed
// sint64 type is not supported by ESPHome API to reduce overhead on embedded systems
/**
* @brief Calculates and adds the size of a length-delimited field (string/bytes) to the total message size
* @brief Calculates and adds the size of a string/bytes field to the total message size
*/
inline void add_length(uint32_t field_id_size, size_t len) {
if (len != 0) {
add_length_force(field_id_size, len);
static inline void add_string_field(uint32_t &total_size, uint32_t field_id_size, const std::string &str) {
// Skip calculation if string is empty
if (str.empty()) {
return; // No need to update total_size
}
// Calculate and directly add to total_size
const uint32_t str_size = static_cast<uint32_t>(str.size());
total_size += field_id_size + varint(str_size) + str_size;
}
/**
* @brief Calculates and adds the size of a length-delimited field (string/bytes) to the total message size (repeated
* field version)
* @brief Calculates and adds the size of a string/bytes field to the total message size (repeated field version)
*/
inline void add_length_force(uint32_t field_id_size, size_t len) {
// Always calculate size when force is true
// Field ID + length varint + data bytes
total_size_ += field_id_size + varint(static_cast<uint32_t>(len)) + static_cast<uint32_t>(len);
static inline void add_string_field_repeated(uint32_t &total_size, uint32_t field_id_size, const std::string &str) {
// Always calculate size for repeated fields
const uint32_t str_size = static_cast<uint32_t>(str.size());
total_size += field_id_size + varint(str_size) + str_size;
}
/**
* @brief Adds a pre-calculated size directly to the total
*
* This is used when we can calculate the total size by multiplying the number
* of elements by the bytes per element (for repeated fixed-size types like float, fixed32, etc.)
*
* @param size The pre-calculated total size to add
*/
inline void add_precalculated_size(uint32_t size) { total_size_ += size; }
/**
* @brief Calculates and adds the size of a nested message field to the total message size
*
@@ -686,21 +712,26 @@ class ProtoSize {
*
* @param nested_size The pre-calculated size of the nested message
*/
inline void add_message_field(uint32_t field_id_size, uint32_t nested_size) {
if (nested_size != 0) {
add_message_field_force(field_id_size, nested_size);
static inline void add_message_field(uint32_t &total_size, uint32_t field_id_size, uint32_t nested_size) {
// Skip calculation if nested message is empty
if (nested_size == 0) {
return; // No need to update total_size
}
// Calculate and directly add to total_size
// Field ID + length varint + nested message content
total_size += field_id_size + varint(nested_size) + nested_size;
}
/**
* @brief Calculates and adds the size of a nested message field to the total message size (force version)
* @brief Calculates and adds the size of a nested message field to the total message size (repeated field version)
*
* @param nested_size The pre-calculated size of the nested message
*/
inline void add_message_field_force(uint32_t field_id_size, uint32_t nested_size) {
// Always calculate size when force is true
static inline void add_message_field_repeated(uint32_t &total_size, uint32_t field_id_size, uint32_t nested_size) {
// Always calculate size for repeated fields
// Field ID + length varint + nested message content
total_size_ += field_id_size + varint(nested_size) + nested_size;
total_size += field_id_size + varint(nested_size) + nested_size;
}
/**
@@ -712,29 +743,26 @@ class ProtoSize {
*
* @param message The nested message object
*/
inline void add_message_object(uint32_t field_id_size, const ProtoMessage &message) {
// Calculate nested message size by creating a temporary ProtoSize
ProtoSize nested_calc;
message.calculate_size(nested_calc);
uint32_t nested_size = nested_calc.get_size();
static inline void add_message_object(uint32_t &total_size, uint32_t field_id_size, const ProtoMessage &message) {
uint32_t nested_size = 0;
message.calculate_size(nested_size);
// Use the base implementation with the calculated nested_size
add_message_field(field_id_size, nested_size);
add_message_field(total_size, field_id_size, nested_size);
}
/**
* @brief Calculates and adds the size of a nested message field to the total message size (force version)
* @brief Calculates and adds the size of a nested message field to the total message size (repeated field version)
*
* @param message The nested message object
*/
inline void add_message_object_force(uint32_t field_id_size, const ProtoMessage &message) {
// Calculate nested message size by creating a temporary ProtoSize
ProtoSize nested_calc;
message.calculate_size(nested_calc);
uint32_t nested_size = nested_calc.get_size();
static inline void add_message_object_repeated(uint32_t &total_size, uint32_t field_id_size,
const ProtoMessage &message) {
uint32_t nested_size = 0;
message.calculate_size(nested_size);
// Use the base implementation with the calculated nested_size
add_message_field_force(field_id_size, nested_size);
add_message_field_repeated(total_size, field_id_size, nested_size);
}
/**
@@ -747,15 +775,16 @@ class ProtoSize {
* @param messages Vector of message objects
*/
template<typename MessageType>
inline void add_repeated_message(uint32_t field_id_size, const std::vector<MessageType> &messages) {
static inline void add_repeated_message(uint32_t &total_size, uint32_t field_id_size,
const std::vector<MessageType> &messages) {
// Skip if the vector is empty
if (messages.empty()) {
return;
}
// Use the force version for all messages in the repeated field
// Use the repeated field version for all messages
for (const auto &message : messages) {
add_message_object_force(field_id_size, message);
add_message_object_repeated(total_size, field_id_size, message);
}
}
};
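// Illustrative sketch (not generated code): how a message's calculate_size() can
// accumulate byte counts with the object-based ProtoSize API above (the variant
// that takes a ProtoSize reference). The field numbers and the three fields are
// made up for the example.
inline uint32_t example_message_size(uint32_t key, const std::string &name, bool enabled) {
  ProtoSize size;
  size.add_uint32(ProtoSize::field(1, 0), key);          // field 1, wire type 0 (varint)
  size.add_length(ProtoSize::field(2, 2), name.size());  // field 2, wire type 2 (length-delimited)
  size.add_bool(ProtoSize::field(3, 0), enabled);        // field 3, wire type 0 (varint)
  return size.get_size();  // total encoded size of the message body
}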
@@ -765,9 +794,8 @@ inline void ProtoWriteBuffer::encode_message(uint32_t field_id, const ProtoMessa
this->encode_field_raw(field_id, 2); // type 2: Length-delimited message
// Calculate the message size first
ProtoSize msg_size;
value.calculate_size(msg_size);
uint32_t msg_length_bytes = msg_size.get_size();
uint32_t msg_length_bytes = 0;
value.calculate_size(msg_length_bytes);
// Calculate how many bytes the length varint needs
uint32_t varint_length_bytes = ProtoSize::varint(msg_length_bytes);
@@ -799,9 +827,7 @@ class ProtoService {
virtual bool is_authenticated() = 0;
virtual bool is_connection_setup() = 0;
virtual void on_fatal_error() = 0;
#ifdef USE_API_PASSWORD
virtual void on_unauthenticated_access() = 0;
#endif
virtual void on_no_setup_connection() = 0;
/**
* Create a buffer with a reserved size.
@@ -816,9 +842,8 @@ class ProtoService {
// Optimized method that pre-allocates buffer based on message size
bool send_message_(const ProtoMessage &msg, uint8_t message_type) {
ProtoSize size;
msg.calculate_size(size);
uint32_t msg_size = size.get_size();
uint32_t msg_size = 0;
msg.calculate_size(msg_size);
// Create a pre-sized buffer
auto buffer = this->create_buffer(msg_size);
@@ -831,7 +856,7 @@ class ProtoService {
}
// Authentication helper methods
inline bool check_connection_setup_() {
bool check_connection_setup_() {
if (!this->is_connection_setup()) {
this->on_no_setup_connection();
return false;
@@ -839,8 +864,7 @@ class ProtoService {
return true;
}
inline bool check_authenticated_() {
#ifdef USE_API_PASSWORD
bool check_authenticated_() {
if (!this->check_connection_setup_()) {
return false;
}
@@ -849,10 +873,8 @@ class ProtoService {
return false;
}
return true;
#else
return this->check_connection_setup_();
#endif
}
};
} // namespace esphome::api
} // namespace api
} // namespace esphome


@@ -3,7 +3,8 @@
#include "api_connection.h"
#include "esphome/core/log.h"
namespace esphome::api {
namespace esphome {
namespace api {
// Generate entity handler implementations using macros
#ifdef USE_BINARY_SENSOR
@@ -68,5 +69,6 @@ INITIAL_STATE_HANDLER(update, update::UpdateEntity)
InitialStateIterator::InitialStateIterator(APIConnection *client) : client_(client) {}
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif


@@ -5,7 +5,8 @@
#include "esphome/core/component.h"
#include "esphome/core/component_iterator.h"
#include "esphome/core/controller.h"
namespace esphome::api {
namespace esphome {
namespace api {
class APIConnection;
@@ -88,5 +89,6 @@ class InitialStateIterator : public ComponentIterator {
APIConnection *client_;
};
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif


@@ -1,7 +1,8 @@
#include "user_services.h"
#include "esphome/core/log.h"
namespace esphome::api {
namespace esphome {
namespace api {
template<> bool get_execute_arg_value<bool>(const ExecuteServiceArgument &arg) { return arg.bool_; }
template<> int32_t get_execute_arg_value<int32_t>(const ExecuteServiceArgument &arg) {
@@ -39,4 +40,5 @@ template<> enums::ServiceArgType to_service_arg_type<std::vector<std::string>>()
return enums::SERVICE_ARG_TYPE_STRING_ARRAY;
}
} // namespace esphome::api
} // namespace api
} // namespace esphome
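
Note: several of the api files in this diff change nothing but how the nested namespace is opened and closed. The two spellings declare the same namespace; the single-block form is the C++17 nested namespace definition. A trivial illustration (contents arbitrary):

```cpp
// One block per level (pre-C++17 style).
namespace esphome {
namespace api {
class MarkerA {};
}  // namespace api
}  // namespace esphome

// C++17 nested namespace definition: same namespace, one block.
namespace esphome::api {
class MarkerB {};
}  // namespace esphome::api
```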

View File

@@ -8,7 +8,8 @@
#include "api_pb2.h"
#ifdef USE_API_SERVICES
namespace esphome::api {
namespace esphome {
namespace api {
class UserServiceDescriptor {
public:
@@ -32,14 +33,14 @@ template<typename... Ts> class UserServiceBase : public UserServiceDescriptor {
ListEntitiesServicesResponse encode_list_service_response() override {
ListEntitiesServicesResponse msg;
msg.set_name(StringRef(this->name_));
msg.name = this->name_;
msg.key = this->key_;
std::array<enums::ServiceArgType, sizeof...(Ts)> arg_types = {to_service_arg_type<Ts>()...};
for (size_t i = 0; i < sizeof...(Ts); i++) {
msg.args.emplace_back();
auto &arg = msg.args.back();
for (int i = 0; i < sizeof...(Ts); i++) {
ListEntitiesServicesArgument arg;
arg.type = arg_types[i];
arg.set_name(StringRef(this->arg_names_[i]));
arg.name = this->arg_names_[i];
msg.args.push_back(arg);
}
return msg;
}
@@ -55,7 +56,7 @@ template<typename... Ts> class UserServiceBase : public UserServiceDescriptor {
protected:
virtual void execute(Ts... x) = 0;
template<int... S> void execute_(const std::vector<ExecuteServiceArgument> &args, seq<S...> type) {
template<int... S> void execute_(std::vector<ExecuteServiceArgument> args, seq<S...> type) {
this->execute((get_execute_arg_value<Ts>(args[S]))...);
}
@@ -73,5 +74,6 @@ template<typename... Ts> class UserServiceTrigger : public UserServiceBase<Ts...
void execute(Ts... x) override { this->trigger(x...); } // NOLINT
};
} // namespace esphome::api
} // namespace api
} // namespace esphome
#endif // USE_API_SERVICES
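
Note: the encode_list_service_response() hunk above swaps a build-a-local-then-push_back pattern for emplace_back followed by filling the element through a reference, which avoids copying each argument object into the vector. A minimal sketch of the two patterns, using a hypothetical Arg struct rather than the generated protobuf message type:

```cpp
#include <string>
#include <vector>

struct Arg {
  int type = 0;
  std::string name;
};

int main() {
  std::vector<Arg> args;

  // Pattern A: build a local, then copy it into the vector.
  Arg a;
  a.type = 1;
  a.name = "brightness";
  args.push_back(a);  // copies a into the vector

  // Pattern B: default-construct in place, then fill through a reference.
  args.emplace_back();
  Arg &b = args.back();
  b.type = 2;
  b.name = "color";
  return 0;
}
```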

View File

@@ -7,6 +7,8 @@ namespace as3935 {
static const char *const TAG = "as3935";
void AS3935Component::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
this->irq_pin_->setup();
LOG_PIN(" IRQ Pin: ", this->irq_pin_);

View File

@@ -7,7 +7,9 @@ namespace as3935_spi {
static const char *const TAG = "as3935_spi";
void SPIAS3935Component::setup() {
ESP_LOGI(TAG, "SPIAS3935Component setup started!");
this->spi_setup();
ESP_LOGI(TAG, "SPI setup finished!");
AS3935Component::setup();
}

View File

@@ -7,7 +7,6 @@ from esphome.const import (
CONF_DIRECTION,
CONF_HYSTERESIS,
CONF_ID,
CONF_POWER_MODE,
CONF_RANGE,
)
@@ -58,6 +57,7 @@ FAST_FILTER = {
CONF_RAW_ANGLE = "raw_angle"
CONF_RAW_POSITION = "raw_position"
CONF_WATCHDOG = "watchdog"
CONF_POWER_MODE = "power_mode"
CONF_SLOW_FILTER = "slow_filter"
CONF_FAST_FILTER = "fast_filter"
CONF_START_POSITION = "start_position"

View File

@@ -23,6 +23,8 @@ static const uint8_t REGISTER_AGC = 0x1A; // 8 bytes / R
static const uint8_t REGISTER_MAGNITUDE = 0x1B; // 16 bytes / R
void AS5600Component::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
if (!this->read_byte(REGISTER_STATUS).has_value()) {
this->mark_failed();
return;

View File

@@ -24,6 +24,7 @@ AS5600Sensor = as5600_ns.class_("AS5600Sensor", sensor.Sensor, cg.PollingCompone
CONF_RAW_ANGLE = "raw_angle"
CONF_RAW_POSITION = "raw_position"
CONF_WATCHDOG = "watchdog"
CONF_POWER_MODE = "power_mode"
CONF_SLOW_FILTER = "slow_filter"
CONF_FAST_FILTER = "fast_filter"
CONF_PWM_FREQUENCY = "pwm_frequency"

View File

@@ -8,6 +8,7 @@ namespace as7341 {
static const char *const TAG = "as7341";
void AS7341Component::setup() {
ESP_LOGCONFIG(TAG, "Running setup");
LOG_I2C_DEVICE(this);
// Verify device ID

View File

@@ -2,7 +2,6 @@ import esphome.codegen as cg
from esphome.components import i2c, sensor
import esphome.config_validation as cv
from esphome.const import (
CONF_CLEAR,
CONF_GAIN,
CONF_ID,
DEVICE_CLASS_ILLUMINANCE,
@@ -30,6 +29,7 @@ CONF_F5 = "f5"
CONF_F6 = "f6"
CONF_F7 = "f7"
CONF_F8 = "f8"
CONF_CLEAR = "clear"
CONF_NIR = "nir"
UNIT_COUNTS = "#"

View File

@@ -8,9 +8,9 @@ from esphome.const import (
PLATFORM_LN882X,
PLATFORM_RTL87XX,
)
from esphome.core import CORE, CoroPriority, coroutine_with_priority
from esphome.core import CORE, coroutine_with_priority
CODEOWNERS = ["@esphome/core"]
CODEOWNERS = ["@OttoWinter"]
CONFIG_SCHEMA = cv.All(
cv.Schema({}),
@@ -27,7 +27,7 @@ CONFIG_SCHEMA = cv.All(
)
@coroutine_with_priority(CoroPriority.NETWORK_TRANSPORT)
@coroutine_with_priority(200.0)
async def to_code(config):
if CORE.is_esp32 or CORE.is_libretiny:
# https://github.com/ESP32Async/AsyncTCP

View File

@@ -71,7 +71,7 @@ bool AT581XComponent::i2c_read_reg(uint8_t addr, uint8_t &data) {
return this->read_register(addr, &data, 1) == esphome::i2c::NO_ERROR;
}
void AT581XComponent::setup() {}
void AT581XComponent::setup() { ESP_LOGCONFIG(TAG, "Running setup"); }
void AT581XComponent::dump_config() { LOG_I2C_DEVICE(this); }
#define ARRAY_SIZE(X) (sizeof(X) / sizeof((X)[0]))
bool AT581XComponent::i2c_write_config() {

Some files were not shown because too many files have changed in this diff.