Mirror of https://github.com/esphome/esphome.git (synced 2026-02-08 16:51:52 +00:00)

Compare commits (4 commits): integratio ... choose-mdn

Commits: 550aa7300c, b48a185612, 16fad85e60, b32c785d7d
@@ -293,12 +293,6 @@ This document provides essential context for AI models interacting with this pro
* **Configuration Design:** Aim for simplicity with sensible defaults, while allowing for advanced customization.
* **Embedded Systems Optimization:** ESPHome targets resource-constrained microcontrollers. Be mindful of flash size and RAM usage.

**Why Heap Allocation Matters:**

ESP devices run for months with small heaps shared between Wi-Fi, BLE, LWIP, and application code. Over time, repeated allocations of different sizes fragment the heap. Failures happen when the largest contiguous block shrinks, even if total free heap is still large. We have seen field crashes caused by this.

**Heap allocation after `setup()` should be avoided unless absolutely necessary.** Every allocation/deallocation cycle contributes to fragmentation. ESPHome treats runtime heap allocation as a long-term reliability bug, not a performance issue. Helpers that hide allocation (`std::string`, `std::to_string`, string-returning helpers) are being deprecated and replaced with buffer- and view-based APIs.
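For illustration, a minimal C++ sketch of the pattern this implies (the class and member names are hypothetical, not an ESPHome API): format into a fixed member buffer that lives for the component's lifetime instead of building a `std::string` on every update.

```cpp
#include <cstdio>

// Hypothetical component sketch - illustrative names only, not ESPHome APIs.
class TemperatureReporter {
 public:
  // Bad (heap churn after setup()):
  //   std::string state = "temp=" + std::to_string(value);
  // Every call allocates and frees, slowly fragmenting the shared heap.

  // Good: reuse a fixed-size member buffer allocated once with the component.
  const char *format_state(float value) {
    std::snprintf(this->buffer_, sizeof(this->buffer_), "temp=%.1f", value);
    return this->buffer_;
  }

 protected:
  char buffer_[32];  // capacity fixed at compile time; no runtime allocation
};
```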
**STL Container Guidelines:**

ESPHome runs on embedded systems with limited resources. Choose containers carefully:
@@ -328,15 +322,15 @@ This document provides essential context for AI models interacting with this pro
std::array<uint8_t, 256> buffer;
```

2. **Compile-time-known fixed sizes with vector-like API:** Use `StaticVector` from `esphome/core/helpers.h` for compile-time fixed size with `push_back()` interface (no dynamic allocation).
2. **Compile-time-known fixed sizes with vector-like API:** Use `StaticVector` from `esphome/core/helpers.h` for fixed-size stack allocation with `push_back()` interface.
```cpp
// Bad - generates STL realloc code (_M_realloc_insert)
std::vector<ServiceRecord> services;
services.reserve(5); // Still includes reallocation machinery

// Good - compile-time fixed size, no dynamic allocation
StaticVector<ServiceRecord, MAX_SERVICES> services;
services.push_back(record1);
// Good - compile-time fixed size, stack allocated, no reallocation machinery
StaticVector<ServiceRecord, MAX_SERVICES> services; // Allocates all MAX_SERVICES on stack
services.push_back(record1); // Tracks count but all slots allocated
```
Use `cg.add_define("MAX_SERVICES", count)` to set the size from Python configuration.
Like `std::array` but with vector-like API (`push_back()`, `size()`) and no STL reallocation code.
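As a rough sketch of how that define ties the Python-side configuration to the C++ capacity (the fallback define, the `esphome::` namespace qualification, and the helper function are assumptions for illustration, not project code):

```cpp
#include <cstdint>

#include "esphome/core/helpers.h"  // StaticVector (assumed to live in namespace esphome)

// Normally injected at build time via cg.add_define("MAX_SERVICES", count);
// the fallback here exists only so the snippet stands alone.
#ifndef MAX_SERVICES
#define MAX_SERVICES 5
#endif

struct ServiceRecord {
  uint16_t port;
};

// Capacity is fixed at compile time; no STL reallocation code is generated.
esphome::StaticVector<ServiceRecord, MAX_SERVICES> services;

void add_service(uint16_t port) {
  if (services.size() < MAX_SERVICES) {  // capacity cannot grow at runtime
    services.push_back(ServiceRecord{port});
  }
}
```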
@@ -378,21 +372,22 @@ This document provides essential context for AI models interacting with this pro
```
Linear search on small datasets (1-16 elements) is often faster than hashing/tree overhead, but this depends on lookup frequency and access patterns. For frequent lookups in hot code paths, the O(1) vs O(n) complexity difference may still matter even for small datasets. `std::vector` with simple structs is usually fine; it's the heavy containers (`map`, `set`, `unordered_map`) that should be avoided for small datasets unless profiling shows otherwise (see the sketch after this list).

5. **Avoid `std::deque`:** It allocates in 512-byte blocks regardless of element size, guaranteeing at least 512 bytes of RAM usage immediately. This is a major source of crashes on memory-constrained devices.

6. **Detection:** Look for these patterns in compiler output:
5. **Detection:** Look for these patterns in compiler output:
- Large code sections with STL symbols (vector, map, set)
- `alloc`, `realloc`, `dealloc` in symbol names
- `_M_realloc_insert`, `_M_default_append` (vector reallocation)
- Red-black tree code (`rb_tree`, `_Rb_tree`)
- Hash table infrastructure (`unordered_map`, `hash`)
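A minimal sketch of that container trade-off (the struct and function are illustrative, not project code): a small `std::vector` searched linearly with `std::find_if` keeps lookups simple and avoids pulling red-black-tree or hash-table machinery into flash.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Illustrative only: a small, rarely-resized table of sensor entries.
struct SensorEntry {
  uint32_t key;
  float value;
};

std::vector<SensorEntry> entries;  // a handful of elements, filled during setup()

// Linear search: no _Rb_tree or hash-table code is generated, and for ~1-16
// elements it is typically as fast as (or faster than) a std::map lookup.
const SensorEntry *find_entry(uint32_t key) {
  auto it = std::find_if(entries.begin(), entries.end(),
                         [key](const SensorEntry &e) { return e.key == key; });
  return it != entries.end() ? &*it : nullptr;
}
```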
**Prioritize optimization effort for:**
**When to optimize:**
- Core components (API, network, logger)
- Widely-used components (mdns, wifi, ble)
- Components causing flash size complaints

Note: Avoiding heap allocation after `setup()` is always required regardless of component type. The prioritization above is about the effort spent on container optimization (e.g., migrating from `std::vector` to `StaticVector`).
**When not to optimize:**
- Single-use niche components
- Code where readability matters more than bytes
- Already using appropriate containers

* **State Management:** Use `CORE.data` for component state that needs to persist during configuration generation. Avoid module-level mutable globals.
@@ -1 +1 @@
0f2b9a65dce7c59289d3aeb40936360a62a7be937b585147b45bb1509eaafb36
d272a88e8ca28ae9340a9a03295a566432a52cb696501908f57764475bf7ca65
@@ -1,96 +0,0 @@
---
name: pr-workflow
description: Create pull requests for esphome. Use when creating PRs, submitting changes, or preparing contributions.
allowed-tools: Read, Bash, Glob, Grep
---

# ESPHome PR Workflow

When creating a pull request for esphome, follow these steps:

## 1. Create Branch from Upstream

Always base your branch on **upstream** (not origin/fork) to ensure you have the latest code:

```bash
git fetch upstream
git checkout -b <branch-name> upstream/dev
```

## 2. Read the PR Template

Before creating a PR, read `.github/PULL_REQUEST_TEMPLATE.md` to understand required fields.

## 3. Create the PR

Use `gh pr create` with the **full template** filled in. Never skip or abbreviate sections.

Required fields:
- **What does this implement/fix?**: Brief description of changes
- **Types of changes**: Check ONE appropriate box (Bugfix, New feature, Breaking change, etc.)
- **Related issue**: Use `fixes <link>` syntax if applicable
- **Pull request in esphome-docs**: Link if docs are needed
- **Test Environment**: Check platforms you tested on
- **Example config.yaml**: Include working example YAML
- **Checklist**: Verify code is tested and tests added

## 4. Example PR Body

```markdown
# What does this implement/fix?

<describe your changes here>

## Types of changes

- [ ] Bugfix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] Developer breaking change (an API change that could break external components)
- [ ] Code quality improvements to existing code or addition of tests
- [ ] Other

**Related issue or feature (if applicable):**

- fixes https://github.com/esphome/esphome/issues/XXX

**Pull request in [esphome-docs](https://github.com/esphome/esphome-docs) with documentation (if applicable):**

- esphome/esphome-docs#XXX

## Test Environment

- [x] ESP32
- [x] ESP32 IDF
- [ ] ESP8266
- [ ] RP2040
- [ ] BK72xx
- [ ] RTL87xx
- [ ] LN882x
- [ ] nRF52840

## Example entry for `config.yaml`:

```yaml
# Example config.yaml
component_name:
  id: my_component
  option: value
```

## Checklist:
- [x] The code change is tested and works locally.
- [x] Tests have been added to verify that the new code works (under `tests/` folder).

If user exposed functionality or configuration variables are added/changed:
- [ ] Documentation added/updated in [esphome-docs](https://github.com/esphome/esphome-docs).
```

## 5. Push and Create PR

```bash
git push -u origin <branch-name>
gh pr create --repo esphome/esphome --base dev --title "[component] Brief description"
```

Title should be prefixed with the component name in brackets, e.g. `[safe_mode] Add feature`.
.github/PULL_REQUEST_TEMPLATE.md (vendored, 1 change)
@@ -27,7 +27,6 @@
- [ ] RP2040
- [ ] BK72xx
- [ ] RTL87xx
- [ ] LN882x
- [ ] nRF52840

## Example entry for `config.yaml`:
.github/actions/restore-python/action.yml (vendored, 4 changes)
@@ -17,12 +17,12 @@ runs:
  steps:
    - name: Set up Python ${{ inputs.python-version }}
      id: python
      uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
      uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
      with:
        python-version: ${{ inputs.python-version }}
    - name: Restore Python virtual environment
      id: cache-venv
      uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
      uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
      with:
        path: venv
        # yamllint disable-line rule:line-length
.github/scripts/auto-label-pr/constants.js (vendored, 38 changes)
@@ -1,38 +0,0 @@
// Constants and markers for PR auto-labeling
module.exports = {
  BOT_COMMENT_MARKER: '<!-- auto-label-pr-bot -->',
  CODEOWNERS_MARKER: '<!-- codeowners-request -->',
  TOO_BIG_MARKER: '<!-- too-big-request -->',
  DEPRECATED_COMPONENT_MARKER: '<!-- deprecated-component-request -->',

  MANAGED_LABELS: [
    'new-component',
    'new-platform',
    'new-target-platform',
    'merging-to-release',
    'merging-to-beta',
    'chained-pr',
    'core',
    'small-pr',
    'dashboard',
    'github-actions',
    'by-code-owner',
    'has-tests',
    'needs-tests',
    'needs-docs',
    'needs-codeowners',
    'too-big',
    'labeller-recheck',
    'bugfix',
    'new-feature',
    'breaking-change',
    'developer-breaking-change',
    'code-quality',
    'deprecated-component'
  ],

  DOCS_PR_PATTERNS: [
    /https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
    /esphome\/esphome-docs#\d+/
  ]
};
.github/scripts/auto-label-pr/detectors.js (vendored, 373 changes)
@@ -1,373 +0,0 @@
|
||||
const fs = require('fs');
|
||||
const { DOCS_PR_PATTERNS } = require('./constants');
|
||||
|
||||
// Strategy: Merge branch detection
|
||||
async function detectMergeBranch(context) {
|
||||
const labels = new Set();
|
||||
const baseRef = context.payload.pull_request.base.ref;
|
||||
|
||||
if (baseRef === 'release') {
|
||||
labels.add('merging-to-release');
|
||||
} else if (baseRef === 'beta') {
|
||||
labels.add('merging-to-beta');
|
||||
} else if (baseRef !== 'dev') {
|
||||
labels.add('chained-pr');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Component and platform labeling
|
||||
async function detectComponentPlatforms(changedFiles, apiData) {
|
||||
const labels = new Set();
|
||||
const componentRegex = /^esphome\/components\/([^\/]+)\//;
|
||||
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);
|
||||
|
||||
for (const file of changedFiles) {
|
||||
const componentMatch = file.match(componentRegex);
|
||||
if (componentMatch) {
|
||||
labels.add(`component: ${componentMatch[1]}`);
|
||||
}
|
||||
|
||||
const platformMatch = file.match(targetPlatformRegex);
|
||||
if (platformMatch) {
|
||||
labels.add(`platform: ${platformMatch[1]}`);
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: New component detection
|
||||
async function detectNewComponents(prFiles) {
|
||||
const labels = new Set();
|
||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
||||
|
||||
for (const file of addedFiles) {
|
||||
const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
|
||||
if (componentMatch) {
|
||||
try {
|
||||
const content = fs.readFileSync(file, 'utf8');
|
||||
if (content.includes('IS_TARGET_PLATFORM = True')) {
|
||||
labels.add('new-target-platform');
|
||||
}
|
||||
} catch (error) {
|
||||
console.log(`Failed to read content of ${file}:`, error.message);
|
||||
}
|
||||
labels.add('new-component');
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: New platform detection
|
||||
async function detectNewPlatforms(prFiles, apiData) {
|
||||
const labels = new Set();
|
||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
||||
|
||||
for (const file of addedFiles) {
|
||||
const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
|
||||
if (platformFileMatch) {
|
||||
const [, component, platform] = platformFileMatch;
|
||||
if (apiData.platformComponents.includes(platform)) {
|
||||
labels.add('new-platform');
|
||||
}
|
||||
}
|
||||
|
||||
const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
|
||||
if (platformDirMatch) {
|
||||
const [, component, platform] = platformDirMatch;
|
||||
if (apiData.platformComponents.includes(platform)) {
|
||||
labels.add('new-platform');
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Core files detection
|
||||
async function detectCoreChanges(changedFiles) {
|
||||
const labels = new Set();
|
||||
const coreFiles = changedFiles.filter(file =>
|
||||
file.startsWith('esphome/core/') ||
|
||||
(file.startsWith('esphome/') && file.split('/').length === 2)
|
||||
);
|
||||
|
||||
if (coreFiles.length > 0) {
|
||||
labels.add('core');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: PR size detection
|
||||
async function detectPRSize(prFiles, totalAdditions, totalDeletions, totalChanges, isMegaPR, SMALL_PR_THRESHOLD, TOO_BIG_THRESHOLD) {
|
||||
const labels = new Set();
|
||||
|
||||
if (totalChanges <= SMALL_PR_THRESHOLD) {
|
||||
labels.add('small-pr');
|
||||
return labels;
|
||||
}
|
||||
|
||||
const testAdditions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.additions || 0), 0);
|
||||
const testDeletions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.deletions || 0), 0);
|
||||
|
||||
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
|
||||
|
||||
// Don't add too-big if mega-pr label is already present
|
||||
if (nonTestChanges > TOO_BIG_THRESHOLD && !isMegaPR) {
|
||||
labels.add('too-big');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Dashboard changes
|
||||
async function detectDashboardChanges(changedFiles) {
|
||||
const labels = new Set();
|
||||
const dashboardFiles = changedFiles.filter(file =>
|
||||
file.startsWith('esphome/dashboard/') ||
|
||||
file.startsWith('esphome/components/dashboard_import/')
|
||||
);
|
||||
|
||||
if (dashboardFiles.length > 0) {
|
||||
labels.add('dashboard');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: GitHub Actions changes
|
||||
async function detectGitHubActionsChanges(changedFiles) {
|
||||
const labels = new Set();
|
||||
const githubActionsFiles = changedFiles.filter(file =>
|
||||
file.startsWith('.github/workflows/')
|
||||
);
|
||||
|
||||
if (githubActionsFiles.length > 0) {
|
||||
labels.add('github-actions');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Code owner detection
|
||||
async function detectCodeOwner(github, context, changedFiles) {
|
||||
const labels = new Set();
|
||||
const { owner, repo } = context.repo;
|
||||
|
||||
try {
|
||||
const { data: codeownersFile } = await github.rest.repos.getContent({
|
||||
owner,
|
||||
repo,
|
||||
path: 'CODEOWNERS',
|
||||
});
|
||||
|
||||
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
|
||||
const prAuthor = context.payload.pull_request.user.login;
|
||||
|
||||
const codeownersLines = codeownersContent.split('\n')
|
||||
.map(line => line.trim())
|
||||
.filter(line => line && !line.startsWith('#'));
|
||||
|
||||
const codeownersRegexes = codeownersLines.map(line => {
|
||||
const parts = line.split(/\s+/);
|
||||
const pattern = parts[0];
|
||||
const owners = parts.slice(1);
|
||||
|
||||
let regex;
|
||||
if (pattern.endsWith('*')) {
|
||||
const dir = pattern.slice(0, -1);
|
||||
regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
|
||||
} else if (pattern.includes('*')) {
|
||||
// First escape all regex special chars except *, then replace * with .*
|
||||
const regexPattern = pattern
|
||||
.replace(/[.+?^${}()|[\]\\]/g, '\\$&')
|
||||
.replace(/\*/g, '.*');
|
||||
regex = new RegExp(`^${regexPattern}$`);
|
||||
} else {
|
||||
regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
|
||||
}
|
||||
|
||||
return { regex, owners };
|
||||
});
|
||||
|
||||
for (const file of changedFiles) {
|
||||
for (const { regex, owners } of codeownersRegexes) {
|
||||
if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
|
||||
labels.add('by-code-owner');
|
||||
return labels;
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
console.log('Failed to read or parse CODEOWNERS file:', error.message);
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Test detection
|
||||
async function detectTests(changedFiles) {
|
||||
const labels = new Set();
|
||||
const testFiles = changedFiles.filter(file => file.startsWith('tests/'));
|
||||
|
||||
if (testFiles.length > 0) {
|
||||
labels.add('has-tests');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: PR Template Checkbox detection
|
||||
async function detectPRTemplateCheckboxes(context) {
|
||||
const labels = new Set();
|
||||
const prBody = context.payload.pull_request.body || '';
|
||||
|
||||
console.log('Checking PR template checkboxes...');
|
||||
|
||||
// Check for checked checkboxes in the "Types of changes" section
|
||||
const checkboxPatterns = [
|
||||
{ pattern: /- \[x\] Bugfix \(non-breaking change which fixes an issue\)/i, label: 'bugfix' },
|
||||
{ pattern: /- \[x\] New feature \(non-breaking change which adds functionality\)/i, label: 'new-feature' },
|
||||
{ pattern: /- \[x\] Breaking change \(fix or feature that would cause existing functionality to not work as expected\)/i, label: 'breaking-change' },
|
||||
{ pattern: /- \[x\] Developer breaking change \(an API change that could break external components\)/i, label: 'developer-breaking-change' },
|
||||
{ pattern: /- \[x\] Code quality improvements to existing code or addition of tests/i, label: 'code-quality' }
|
||||
];
|
||||
|
||||
for (const { pattern, label } of checkboxPatterns) {
|
||||
if (pattern.test(prBody)) {
|
||||
console.log(`Found checked checkbox for: ${label}`);
|
||||
labels.add(label);
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Deprecated component detection
|
||||
async function detectDeprecatedComponents(github, context, changedFiles) {
|
||||
const labels = new Set();
|
||||
const deprecatedInfo = [];
|
||||
const { owner, repo } = context.repo;
|
||||
|
||||
// Compile regex once for better performance
|
||||
const componentFileRegex = /^esphome\/components\/([^\/]+)\//;
|
||||
|
||||
// Get files that are modified or added in components directory
|
||||
const componentFiles = changedFiles.filter(file => componentFileRegex.test(file));
|
||||
|
||||
if (componentFiles.length === 0) {
|
||||
return { labels, deprecatedInfo };
|
||||
}
|
||||
|
||||
// Extract unique component names using the same regex
|
||||
const components = new Set();
|
||||
for (const file of componentFiles) {
|
||||
const match = file.match(componentFileRegex);
|
||||
if (match) {
|
||||
components.add(match[1]);
|
||||
}
|
||||
}
|
||||
|
||||
// Get PR head to fetch files from the PR branch
|
||||
const prNumber = context.payload.pull_request.number;
|
||||
|
||||
// Check each component's __init__.py for DEPRECATED_COMPONENT constant
|
||||
for (const component of components) {
|
||||
const initFile = `esphome/components/${component}/__init__.py`;
|
||||
try {
|
||||
// Fetch file content from PR head using GitHub API
|
||||
const { data: fileData } = await github.rest.repos.getContent({
|
||||
owner,
|
||||
repo,
|
||||
path: initFile,
|
||||
ref: `refs/pull/${prNumber}/head`
|
||||
});
|
||||
|
||||
// Decode base64 content
|
||||
const content = Buffer.from(fileData.content, 'base64').toString('utf8');
|
||||
|
||||
// Look for DEPRECATED_COMPONENT = "message" or DEPRECATED_COMPONENT = 'message'
|
||||
// Support single quotes, double quotes, and triple quotes (for multiline)
|
||||
const doubleQuoteMatch = content.match(/DEPRECATED_COMPONENT\s*=\s*"""([\s\S]*?)"""/s) ||
|
||||
content.match(/DEPRECATED_COMPONENT\s*=\s*"((?:[^"\\]|\\.)*)"/);
|
||||
const singleQuoteMatch = content.match(/DEPRECATED_COMPONENT\s*=\s*'''([\s\S]*?)'''/s) ||
|
||||
content.match(/DEPRECATED_COMPONENT\s*=\s*'((?:[^'\\]|\\.)*)'/);
|
||||
const deprecatedMatch = doubleQuoteMatch || singleQuoteMatch;
|
||||
|
||||
if (deprecatedMatch) {
|
||||
labels.add('deprecated-component');
|
||||
deprecatedInfo.push({
|
||||
component: component,
|
||||
message: deprecatedMatch[1].trim()
|
||||
});
|
||||
console.log(`Found deprecated component: ${component}`);
|
||||
}
|
||||
} catch (error) {
|
||||
// Only log if it's not a simple "file not found" error (404)
|
||||
if (error.status !== 404) {
|
||||
console.log(`Error reading ${initFile}:`, error.message);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return { labels, deprecatedInfo };
|
||||
}
|
||||
|
||||
// Strategy: Requirements detection
|
||||
async function detectRequirements(allLabels, prFiles, context) {
|
||||
const labels = new Set();
|
||||
|
||||
// Check for missing tests
|
||||
if ((allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) && !allLabels.has('has-tests')) {
|
||||
labels.add('needs-tests');
|
||||
}
|
||||
|
||||
// Check for missing docs
|
||||
if (allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) {
|
||||
const prBody = context.payload.pull_request.body || '';
|
||||
const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));
|
||||
|
||||
if (!hasDocsLink) {
|
||||
labels.add('needs-docs');
|
||||
}
|
||||
}
|
||||
|
||||
// Check for missing CODEOWNERS
|
||||
if (allLabels.has('new-component')) {
|
||||
const codeownersModified = prFiles.some(file =>
|
||||
file.filename === 'CODEOWNERS' &&
|
||||
(file.status === 'modified' || file.status === 'added') &&
|
||||
(file.additions || 0) > 0
|
||||
);
|
||||
|
||||
if (!codeownersModified) {
|
||||
labels.add('needs-codeowners');
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
detectMergeBranch,
|
||||
detectComponentPlatforms,
|
||||
detectNewComponents,
|
||||
detectNewPlatforms,
|
||||
detectCoreChanges,
|
||||
detectPRSize,
|
||||
detectDashboardChanges,
|
||||
detectGitHubActionsChanges,
|
||||
detectCodeOwner,
|
||||
detectTests,
|
||||
detectPRTemplateCheckboxes,
|
||||
detectDeprecatedComponents,
|
||||
detectRequirements
|
||||
};
|
||||
.github/scripts/auto-label-pr/index.js (vendored, 187 changes)
@@ -1,187 +0,0 @@
|
||||
const { MANAGED_LABELS } = require('./constants');
|
||||
const {
|
||||
detectMergeBranch,
|
||||
detectComponentPlatforms,
|
||||
detectNewComponents,
|
||||
detectNewPlatforms,
|
||||
detectCoreChanges,
|
||||
detectPRSize,
|
||||
detectDashboardChanges,
|
||||
detectGitHubActionsChanges,
|
||||
detectCodeOwner,
|
||||
detectTests,
|
||||
detectPRTemplateCheckboxes,
|
||||
detectDeprecatedComponents,
|
||||
detectRequirements
|
||||
} = require('./detectors');
|
||||
const { handleReviews } = require('./reviews');
|
||||
const { applyLabels, removeOldLabels } = require('./labels');
|
||||
|
||||
// Fetch API data
|
||||
async function fetchApiData() {
|
||||
try {
|
||||
const response = await fetch('https://data.esphome.io/components.json');
|
||||
const componentsData = await response.json();
|
||||
return {
|
||||
targetPlatforms: componentsData.target_platforms || [],
|
||||
platformComponents: componentsData.platform_components || []
|
||||
};
|
||||
} catch (error) {
|
||||
console.log('Failed to fetch components data from API:', error.message);
|
||||
return { targetPlatforms: [], platformComponents: [] };
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = async ({ github, context }) => {
|
||||
// Environment variables
|
||||
const SMALL_PR_THRESHOLD = parseInt(process.env.SMALL_PR_THRESHOLD);
|
||||
const MAX_LABELS = parseInt(process.env.MAX_LABELS);
|
||||
const TOO_BIG_THRESHOLD = parseInt(process.env.TOO_BIG_THRESHOLD);
|
||||
const COMPONENT_LABEL_THRESHOLD = parseInt(process.env.COMPONENT_LABEL_THRESHOLD);
|
||||
|
||||
// Global state
|
||||
const { owner, repo } = context.repo;
|
||||
const pr_number = context.issue.number;
|
||||
|
||||
// Get current labels and PR data
|
||||
const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number
|
||||
});
|
||||
const currentLabels = currentLabelsData.map(label => label.name);
|
||||
const managedLabels = currentLabels.filter(label =>
|
||||
label.startsWith('component: ') || MANAGED_LABELS.includes(label)
|
||||
);
|
||||
|
||||
// Check for mega-PR early - if present, skip most automatic labeling
|
||||
const isMegaPR = currentLabels.includes('mega-pr');
|
||||
|
||||
// Get all PR files with automatic pagination
|
||||
const prFiles = await github.paginate(
|
||||
github.rest.pulls.listFiles,
|
||||
{
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
}
|
||||
);
|
||||
|
||||
// Calculate data from PR files
|
||||
const changedFiles = prFiles.map(file => file.filename);
|
||||
const totalAdditions = prFiles.reduce((sum, file) => sum + (file.additions || 0), 0);
|
||||
const totalDeletions = prFiles.reduce((sum, file) => sum + (file.deletions || 0), 0);
|
||||
const totalChanges = totalAdditions + totalDeletions;
|
||||
|
||||
console.log('Current labels:', currentLabels.join(', '));
|
||||
console.log('Changed files:', changedFiles.length);
|
||||
console.log('Total changes:', totalChanges);
|
||||
if (isMegaPR) {
|
||||
console.log('Mega-PR detected - applying limited labeling logic');
|
||||
}
|
||||
|
||||
// Fetch API data
|
||||
const apiData = await fetchApiData();
|
||||
const baseRef = context.payload.pull_request.base.ref;
|
||||
|
||||
// Early exit for release and beta branches only
|
||||
if (baseRef === 'release' || baseRef === 'beta') {
|
||||
const branchLabels = await detectMergeBranch(context);
|
||||
const finalLabels = Array.from(branchLabels);
|
||||
|
||||
console.log('Computed labels (merge branch only):', finalLabels.join(', '));
|
||||
|
||||
// Apply labels
|
||||
await applyLabels(github, context, finalLabels);
|
||||
|
||||
// Remove old managed labels
|
||||
await removeOldLabels(github, context, managedLabels, finalLabels);
|
||||
|
||||
return;
|
||||
}
|
||||
|
||||
// Run all strategies
|
||||
const [
|
||||
branchLabels,
|
||||
componentLabels,
|
||||
newComponentLabels,
|
||||
newPlatformLabels,
|
||||
coreLabels,
|
||||
sizeLabels,
|
||||
dashboardLabels,
|
||||
actionsLabels,
|
||||
codeOwnerLabels,
|
||||
testLabels,
|
||||
checkboxLabels,
|
||||
deprecatedResult
|
||||
] = await Promise.all([
|
||||
detectMergeBranch(context),
|
||||
detectComponentPlatforms(changedFiles, apiData),
|
||||
detectNewComponents(prFiles),
|
||||
detectNewPlatforms(prFiles, apiData),
|
||||
detectCoreChanges(changedFiles),
|
||||
detectPRSize(prFiles, totalAdditions, totalDeletions, totalChanges, isMegaPR, SMALL_PR_THRESHOLD, TOO_BIG_THRESHOLD),
|
||||
detectDashboardChanges(changedFiles),
|
||||
detectGitHubActionsChanges(changedFiles),
|
||||
detectCodeOwner(github, context, changedFiles),
|
||||
detectTests(changedFiles),
|
||||
detectPRTemplateCheckboxes(context),
|
||||
detectDeprecatedComponents(github, context, changedFiles)
|
||||
]);
|
||||
|
||||
// Extract deprecated component info
|
||||
const deprecatedLabels = deprecatedResult.labels;
|
||||
const deprecatedInfo = deprecatedResult.deprecatedInfo;
|
||||
|
||||
// Combine all labels
|
||||
const allLabels = new Set([
|
||||
...branchLabels,
|
||||
...componentLabels,
|
||||
...newComponentLabels,
|
||||
...newPlatformLabels,
|
||||
...coreLabels,
|
||||
...sizeLabels,
|
||||
...dashboardLabels,
|
||||
...actionsLabels,
|
||||
...codeOwnerLabels,
|
||||
...testLabels,
|
||||
...checkboxLabels,
|
||||
...deprecatedLabels
|
||||
]);
|
||||
|
||||
// Detect requirements based on all other labels
|
||||
const requirementLabels = await detectRequirements(allLabels, prFiles, context);
|
||||
for (const label of requirementLabels) {
|
||||
allLabels.add(label);
|
||||
}
|
||||
|
||||
let finalLabels = Array.from(allLabels);
|
||||
|
||||
// For mega-PRs, exclude component labels if there are too many
|
||||
if (isMegaPR) {
|
||||
const componentLabels = finalLabels.filter(label => label.startsWith('component: '));
|
||||
if (componentLabels.length > COMPONENT_LABEL_THRESHOLD) {
|
||||
finalLabels = finalLabels.filter(label => !label.startsWith('component: '));
|
||||
console.log(`Mega-PR detected - excluding ${componentLabels.length} component labels (threshold: ${COMPONENT_LABEL_THRESHOLD})`);
|
||||
}
|
||||
}
|
||||
|
||||
// Handle too many labels (only for non-mega PRs)
|
||||
const tooManyLabels = finalLabels.length > MAX_LABELS;
|
||||
const originalLabelCount = finalLabels.length;
|
||||
|
||||
if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
|
||||
finalLabels = ['too-big'];
|
||||
}
|
||||
|
||||
console.log('Computed labels:', finalLabels.join(', '));
|
||||
|
||||
// Handle reviews
|
||||
await handleReviews(github, context, finalLabels, originalLabelCount, deprecatedInfo, prFiles, totalAdditions, totalDeletions, MAX_LABELS, TOO_BIG_THRESHOLD);
|
||||
|
||||
// Apply labels
|
||||
await applyLabels(github, context, finalLabels);
|
||||
|
||||
// Remove old managed labels
|
||||
await removeOldLabels(github, context, managedLabels, finalLabels);
|
||||
};
|
||||
.github/scripts/auto-label-pr/labels.js (vendored, 41 changes)
@@ -1,41 +0,0 @@
|
||||
// Apply labels to PR
|
||||
async function applyLabels(github, context, finalLabels) {
|
||||
const { owner, repo } = context.repo;
|
||||
const pr_number = context.issue.number;
|
||||
|
||||
if (finalLabels.length > 0) {
|
||||
console.log(`Adding labels: ${finalLabels.join(', ')}`);
|
||||
await github.rest.issues.addLabels({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number,
|
||||
labels: finalLabels
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Remove old managed labels
|
||||
async function removeOldLabels(github, context, managedLabels, finalLabels) {
|
||||
const { owner, repo } = context.repo;
|
||||
const pr_number = context.issue.number;
|
||||
|
||||
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
|
||||
for (const label of labelsToRemove) {
|
||||
console.log(`Removing label: ${label}`);
|
||||
try {
|
||||
await github.rest.issues.removeLabel({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number,
|
||||
name: label
|
||||
});
|
||||
} catch (error) {
|
||||
console.log(`Failed to remove label ${label}:`, error.message);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
applyLabels,
|
||||
removeOldLabels
|
||||
};
|
||||
.github/scripts/auto-label-pr/reviews.js (vendored, 141 changes)
@@ -1,141 +0,0 @@
|
||||
const {
|
||||
BOT_COMMENT_MARKER,
|
||||
CODEOWNERS_MARKER,
|
||||
TOO_BIG_MARKER,
|
||||
DEPRECATED_COMPONENT_MARKER
|
||||
} = require('./constants');
|
||||
|
||||
// Generate review messages
|
||||
function generateReviewMessages(finalLabels, originalLabelCount, deprecatedInfo, prFiles, totalAdditions, totalDeletions, prAuthor, MAX_LABELS, TOO_BIG_THRESHOLD) {
|
||||
const messages = [];
|
||||
|
||||
// Deprecated component message
|
||||
if (finalLabels.includes('deprecated-component') && deprecatedInfo && deprecatedInfo.length > 0) {
|
||||
let message = `${DEPRECATED_COMPONENT_MARKER}\n### ⚠️ Deprecated Component\n\n`;
|
||||
message += `Hey there @${prAuthor},\n`;
|
||||
message += `This PR modifies one or more deprecated components. Please be aware:\n\n`;
|
||||
|
||||
for (const info of deprecatedInfo) {
|
||||
message += `#### Component: \`${info.component}\`\n`;
|
||||
message += `${info.message}\n\n`;
|
||||
}
|
||||
|
||||
message += `Consider migrating to the recommended alternative if applicable.`;
|
||||
|
||||
messages.push(message);
|
||||
}
|
||||
|
||||
// Too big message
|
||||
if (finalLabels.includes('too-big')) {
|
||||
const testAdditions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.additions || 0), 0);
|
||||
const testDeletions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.deletions || 0), 0);
|
||||
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
|
||||
|
||||
const tooManyLabels = originalLabelCount > MAX_LABELS;
|
||||
const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;
|
||||
|
||||
let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;
|
||||
|
||||
if (tooManyLabels && tooManyChanges) {
|
||||
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${originalLabelCount} different components/areas.`;
|
||||
} else if (tooManyLabels) {
|
||||
message += `This PR affects ${originalLabelCount} different components/areas.`;
|
||||
} else {
|
||||
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
|
||||
}
|
||||
|
||||
message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
|
||||
message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;
|
||||
|
||||
messages.push(message);
|
||||
}
|
||||
|
||||
// CODEOWNERS message
|
||||
if (finalLabels.includes('needs-codeowners')) {
|
||||
const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
|
||||
`Hey there @${prAuthor},\n` +
|
||||
`Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
|
||||
`This way we can notify you if a bug report for this integration is reported.\n\n` +
|
||||
`In \`__init__.py\` of the integration, please add:\n\n` +
|
||||
`\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
|
||||
`And run \`script/build_codeowners.py\``;
|
||||
|
||||
messages.push(message);
|
||||
}
|
||||
|
||||
return messages;
|
||||
}
|
||||
|
||||
// Handle reviews
|
||||
async function handleReviews(github, context, finalLabels, originalLabelCount, deprecatedInfo, prFiles, totalAdditions, totalDeletions, MAX_LABELS, TOO_BIG_THRESHOLD) {
|
||||
const { owner, repo } = context.repo;
|
||||
const pr_number = context.issue.number;
|
||||
const prAuthor = context.payload.pull_request.user.login;
|
||||
|
||||
const reviewMessages = generateReviewMessages(finalLabels, originalLabelCount, deprecatedInfo, prFiles, totalAdditions, totalDeletions, prAuthor, MAX_LABELS, TOO_BIG_THRESHOLD);
|
||||
const hasReviewableLabels = finalLabels.some(label =>
|
||||
['too-big', 'needs-codeowners', 'deprecated-component'].includes(label)
|
||||
);
|
||||
|
||||
const { data: reviews } = await github.rest.pulls.listReviews({
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
});
|
||||
|
||||
const botReviews = reviews.filter(review =>
|
||||
review.user.type === 'Bot' &&
|
||||
review.state === 'CHANGES_REQUESTED' &&
|
||||
review.body && review.body.includes(BOT_COMMENT_MARKER)
|
||||
);
|
||||
|
||||
if (hasReviewableLabels) {
|
||||
const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;
|
||||
|
||||
if (botReviews.length > 0) {
|
||||
// Update existing review
|
||||
await github.rest.pulls.updateReview({
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number,
|
||||
review_id: botReviews[0].id,
|
||||
body: reviewBody
|
||||
});
|
||||
console.log('Updated existing bot review');
|
||||
} else {
|
||||
// Create new review
|
||||
await github.rest.pulls.createReview({
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number,
|
||||
body: reviewBody,
|
||||
event: 'REQUEST_CHANGES'
|
||||
});
|
||||
console.log('Created new bot review');
|
||||
}
|
||||
} else if (botReviews.length > 0) {
|
||||
// Dismiss existing reviews
|
||||
for (const review of botReviews) {
|
||||
try {
|
||||
await github.rest.pulls.dismissReview({
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number,
|
||||
review_id: review.id,
|
||||
message: 'Review dismissed: All requirements have been met'
|
||||
});
|
||||
console.log(`Dismissed bot review ${review.id}`);
|
||||
} catch (error) {
|
||||
console.log(`Failed to dismiss review ${review.id}:`, error.message);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
handleReviews
|
||||
};
|
||||
.github/workflows/auto-label-pr.yml (vendored, 634 changes)
@@ -22,7 +22,7 @@ jobs:
|
||||
if: github.event.action != 'labeled' || github.event.sender.type != 'Bot'
|
||||
steps:
|
||||
- name: Checkout
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
|
||||
- name: Generate a token
|
||||
id: generate-token
|
||||
@@ -36,5 +36,633 @@ jobs:
|
||||
with:
|
||||
github-token: ${{ steps.generate-token.outputs.token }}
|
||||
script: |
|
||||
const script = require('./.github/scripts/auto-label-pr/index.js');
|
||||
await script({ github, context });
|
||||
const fs = require('fs');
|
||||
|
||||
// Constants
|
||||
const SMALL_PR_THRESHOLD = parseInt('${{ env.SMALL_PR_THRESHOLD }}');
|
||||
const MAX_LABELS = parseInt('${{ env.MAX_LABELS }}');
|
||||
const TOO_BIG_THRESHOLD = parseInt('${{ env.TOO_BIG_THRESHOLD }}');
|
||||
const COMPONENT_LABEL_THRESHOLD = parseInt('${{ env.COMPONENT_LABEL_THRESHOLD }}');
|
||||
const BOT_COMMENT_MARKER = '<!-- auto-label-pr-bot -->';
|
||||
const CODEOWNERS_MARKER = '<!-- codeowners-request -->';
|
||||
const TOO_BIG_MARKER = '<!-- too-big-request -->';
|
||||
|
||||
const MANAGED_LABELS = [
|
||||
'new-component',
|
||||
'new-platform',
|
||||
'new-target-platform',
|
||||
'merging-to-release',
|
||||
'merging-to-beta',
|
||||
'chained-pr',
|
||||
'core',
|
||||
'small-pr',
|
||||
'dashboard',
|
||||
'github-actions',
|
||||
'by-code-owner',
|
||||
'has-tests',
|
||||
'needs-tests',
|
||||
'needs-docs',
|
||||
'needs-codeowners',
|
||||
'too-big',
|
||||
'labeller-recheck',
|
||||
'bugfix',
|
||||
'new-feature',
|
||||
'breaking-change',
|
||||
'developer-breaking-change',
|
||||
'code-quality'
|
||||
];
|
||||
|
||||
const DOCS_PR_PATTERNS = [
|
||||
/https:\/\/github\.com\/esphome\/esphome-docs\/pull\/\d+/,
|
||||
/esphome\/esphome-docs#\d+/
|
||||
];
|
||||
|
||||
// Global state
|
||||
const { owner, repo } = context.repo;
|
||||
const pr_number = context.issue.number;
|
||||
|
||||
// Get current labels and PR data
|
||||
const { data: currentLabelsData } = await github.rest.issues.listLabelsOnIssue({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: pr_number
|
||||
});
|
||||
const currentLabels = currentLabelsData.map(label => label.name);
|
||||
const managedLabels = currentLabels.filter(label =>
|
||||
label.startsWith('component: ') || MANAGED_LABELS.includes(label)
|
||||
);
|
||||
|
||||
// Check for mega-PR early - if present, skip most automatic labeling
|
||||
const isMegaPR = currentLabels.includes('mega-pr');
|
||||
|
||||
// Get all PR files with automatic pagination
|
||||
const prFiles = await github.paginate(
|
||||
github.rest.pulls.listFiles,
|
||||
{
|
||||
owner,
|
||||
repo,
|
||||
pull_number: pr_number
|
||||
}
|
||||
);
|
||||
|
||||
// Calculate data from PR files
|
||||
const changedFiles = prFiles.map(file => file.filename);
|
||||
const totalAdditions = prFiles.reduce((sum, file) => sum + (file.additions || 0), 0);
|
||||
const totalDeletions = prFiles.reduce((sum, file) => sum + (file.deletions || 0), 0);
|
||||
const totalChanges = totalAdditions + totalDeletions;
|
||||
|
||||
console.log('Current labels:', currentLabels.join(', '));
|
||||
console.log('Changed files:', changedFiles.length);
|
||||
console.log('Total changes:', totalChanges);
|
||||
if (isMegaPR) {
|
||||
console.log('Mega-PR detected - applying limited labeling logic');
|
||||
}
|
||||
|
||||
// Fetch API data
|
||||
async function fetchApiData() {
|
||||
try {
|
||||
const response = await fetch('https://data.esphome.io/components.json');
|
||||
const componentsData = await response.json();
|
||||
return {
|
||||
targetPlatforms: componentsData.target_platforms || [],
|
||||
platformComponents: componentsData.platform_components || []
|
||||
};
|
||||
} catch (error) {
|
||||
console.log('Failed to fetch components data from API:', error.message);
|
||||
return { targetPlatforms: [], platformComponents: [] };
|
||||
}
|
||||
}
|
||||
|
||||
// Strategy: Merge branch detection
|
||||
async function detectMergeBranch() {
|
||||
const labels = new Set();
|
||||
const baseRef = context.payload.pull_request.base.ref;
|
||||
|
||||
if (baseRef === 'release') {
|
||||
labels.add('merging-to-release');
|
||||
} else if (baseRef === 'beta') {
|
||||
labels.add('merging-to-beta');
|
||||
} else if (baseRef !== 'dev') {
|
||||
labels.add('chained-pr');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Component and platform labeling
|
||||
async function detectComponentPlatforms(apiData) {
|
||||
const labels = new Set();
|
||||
const componentRegex = /^esphome\/components\/([^\/]+)\//;
|
||||
const targetPlatformRegex = new RegExp(`^esphome\/components\/(${apiData.targetPlatforms.join('|')})/`);
|
||||
|
||||
for (const file of changedFiles) {
|
||||
const componentMatch = file.match(componentRegex);
|
||||
if (componentMatch) {
|
||||
labels.add(`component: ${componentMatch[1]}`);
|
||||
}
|
||||
|
||||
const platformMatch = file.match(targetPlatformRegex);
|
||||
if (platformMatch) {
|
||||
labels.add(`platform: ${platformMatch[1]}`);
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: New component detection
|
||||
async function detectNewComponents() {
|
||||
const labels = new Set();
|
||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
||||
|
||||
for (const file of addedFiles) {
|
||||
const componentMatch = file.match(/^esphome\/components\/([^\/]+)\/__init__\.py$/);
|
||||
if (componentMatch) {
|
||||
try {
|
||||
const content = fs.readFileSync(file, 'utf8');
|
||||
if (content.includes('IS_TARGET_PLATFORM = True')) {
|
||||
labels.add('new-target-platform');
|
||||
}
|
||||
} catch (error) {
|
||||
console.log(`Failed to read content of ${file}:`, error.message);
|
||||
}
|
||||
labels.add('new-component');
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: New platform detection
|
||||
async function detectNewPlatforms(apiData) {
|
||||
const labels = new Set();
|
||||
const addedFiles = prFiles.filter(file => file.status === 'added').map(file => file.filename);
|
||||
|
||||
for (const file of addedFiles) {
|
||||
const platformFileMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\.py$/);
|
||||
if (platformFileMatch) {
|
||||
const [, component, platform] = platformFileMatch;
|
||||
if (apiData.platformComponents.includes(platform)) {
|
||||
labels.add('new-platform');
|
||||
}
|
||||
}
|
||||
|
||||
const platformDirMatch = file.match(/^esphome\/components\/([^\/]+)\/([^\/]+)\/__init__\.py$/);
|
||||
if (platformDirMatch) {
|
||||
const [, component, platform] = platformDirMatch;
|
||||
if (apiData.platformComponents.includes(platform)) {
|
||||
labels.add('new-platform');
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Core files detection
|
||||
async function detectCoreChanges() {
|
||||
const labels = new Set();
|
||||
const coreFiles = changedFiles.filter(file =>
|
||||
file.startsWith('esphome/core/') ||
|
||||
(file.startsWith('esphome/') && file.split('/').length === 2)
|
||||
);
|
||||
|
||||
if (coreFiles.length > 0) {
|
||||
labels.add('core');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: PR size detection
|
||||
async function detectPRSize() {
|
||||
const labels = new Set();
|
||||
|
||||
if (totalChanges <= SMALL_PR_THRESHOLD) {
|
||||
labels.add('small-pr');
|
||||
return labels;
|
||||
}
|
||||
|
||||
const testAdditions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.additions || 0), 0);
|
||||
const testDeletions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.deletions || 0), 0);
|
||||
|
||||
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
|
||||
|
||||
// Don't add too-big if mega-pr label is already present
|
||||
if (nonTestChanges > TOO_BIG_THRESHOLD && !isMegaPR) {
|
||||
labels.add('too-big');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Dashboard changes
|
||||
async function detectDashboardChanges() {
|
||||
const labels = new Set();
|
||||
const dashboardFiles = changedFiles.filter(file =>
|
||||
file.startsWith('esphome/dashboard/') ||
|
||||
file.startsWith('esphome/components/dashboard_import/')
|
||||
);
|
||||
|
||||
if (dashboardFiles.length > 0) {
|
||||
labels.add('dashboard');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: GitHub Actions changes
|
||||
async function detectGitHubActionsChanges() {
|
||||
const labels = new Set();
|
||||
const githubActionsFiles = changedFiles.filter(file =>
|
||||
file.startsWith('.github/workflows/')
|
||||
);
|
||||
|
||||
if (githubActionsFiles.length > 0) {
|
||||
labels.add('github-actions');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Code owner detection
|
||||
async function detectCodeOwner() {
|
||||
const labels = new Set();
|
||||
|
||||
try {
|
||||
const { data: codeownersFile } = await github.rest.repos.getContent({
|
||||
owner,
|
||||
repo,
|
||||
path: 'CODEOWNERS',
|
||||
});
|
||||
|
||||
const codeownersContent = Buffer.from(codeownersFile.content, 'base64').toString('utf8');
|
||||
const prAuthor = context.payload.pull_request.user.login;
|
||||
|
||||
const codeownersLines = codeownersContent.split('\n')
|
||||
.map(line => line.trim())
|
||||
.filter(line => line && !line.startsWith('#'));
|
||||
|
||||
const codeownersRegexes = codeownersLines.map(line => {
|
||||
const parts = line.split(/\s+/);
|
||||
const pattern = parts[0];
|
||||
const owners = parts.slice(1);
|
||||
|
||||
let regex;
|
||||
if (pattern.endsWith('*')) {
|
||||
const dir = pattern.slice(0, -1);
|
||||
regex = new RegExp(`^${dir.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}`);
|
||||
} else if (pattern.includes('*')) {
|
||||
// First escape all regex special chars except *, then replace * with .*
|
||||
const regexPattern = pattern
|
||||
.replace(/[.+?^${}()|[\]\\]/g, '\\$&')
|
||||
.replace(/\*/g, '.*');
|
||||
regex = new RegExp(`^${regexPattern}$`);
|
||||
} else {
|
||||
regex = new RegExp(`^${pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}$`);
|
||||
}
|
||||
|
||||
return { regex, owners };
|
||||
});
|
||||
|
||||
for (const file of changedFiles) {
|
||||
for (const { regex, owners } of codeownersRegexes) {
|
||||
if (regex.test(file) && owners.some(owner => owner === `@${prAuthor}`)) {
|
||||
labels.add('by-code-owner');
|
||||
return labels;
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
console.log('Failed to read or parse CODEOWNERS file:', error.message);
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Test detection
|
||||
async function detectTests() {
|
||||
const labels = new Set();
|
||||
const testFiles = changedFiles.filter(file => file.startsWith('tests/'));
|
||||
|
||||
if (testFiles.length > 0) {
|
||||
labels.add('has-tests');
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: PR Template Checkbox detection
|
||||
async function detectPRTemplateCheckboxes() {
|
||||
const labels = new Set();
|
||||
const prBody = context.payload.pull_request.body || '';
|
||||
|
||||
console.log('Checking PR template checkboxes...');
|
||||
|
||||
// Check for checked checkboxes in the "Types of changes" section
|
||||
const checkboxPatterns = [
|
||||
{ pattern: /- \[x\] Bugfix \(non-breaking change which fixes an issue\)/i, label: 'bugfix' },
|
||||
{ pattern: /- \[x\] New feature \(non-breaking change which adds functionality\)/i, label: 'new-feature' },
|
||||
{ pattern: /- \[x\] Breaking change \(fix or feature that would cause existing functionality to not work as expected\)/i, label: 'breaking-change' },
|
||||
{ pattern: /- \[x\] Developer breaking change \(an API change that could break external components\)/i, label: 'developer-breaking-change' },
|
||||
{ pattern: /- \[x\] Code quality improvements to existing code or addition of tests/i, label: 'code-quality' }
|
||||
];
|
||||
|
||||
for (const { pattern, label } of checkboxPatterns) {
|
||||
if (pattern.test(prBody)) {
|
||||
console.log(`Found checked checkbox for: ${label}`);
|
||||
labels.add(label);
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Strategy: Requirements detection
|
||||
async function detectRequirements(allLabels) {
|
||||
const labels = new Set();
|
||||
|
||||
// Check for missing tests
|
||||
if ((allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) && !allLabels.has('has-tests')) {
|
||||
labels.add('needs-tests');
|
||||
}
|
||||
|
||||
// Check for missing docs
|
||||
if (allLabels.has('new-component') || allLabels.has('new-platform') || allLabels.has('new-feature')) {
|
||||
const prBody = context.payload.pull_request.body || '';
|
||||
const hasDocsLink = DOCS_PR_PATTERNS.some(pattern => pattern.test(prBody));
|
||||
|
||||
if (!hasDocsLink) {
|
||||
labels.add('needs-docs');
|
||||
}
|
||||
}
|
||||
|
||||
// Check for missing CODEOWNERS
|
||||
if (allLabels.has('new-component')) {
|
||||
const codeownersModified = prFiles.some(file =>
|
||||
file.filename === 'CODEOWNERS' &&
|
||||
(file.status === 'modified' || file.status === 'added') &&
|
||||
(file.additions || 0) > 0
|
||||
);
|
||||
|
||||
if (!codeownersModified) {
|
||||
labels.add('needs-codeowners');
|
||||
}
|
||||
}
|
||||
|
||||
return labels;
|
||||
}
|
||||
|
||||
// Generate review messages
|
||||
function generateReviewMessages(finalLabels, originalLabelCount) {
|
||||
const messages = [];
|
||||
const prAuthor = context.payload.pull_request.user.login;
|
||||
|
||||
// Too big message
|
||||
if (finalLabels.includes('too-big')) {
|
||||
const testAdditions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.additions || 0), 0);
|
||||
const testDeletions = prFiles
|
||||
.filter(file => file.filename.startsWith('tests/'))
|
||||
.reduce((sum, file) => sum + (file.deletions || 0), 0);
|
||||
const nonTestChanges = (totalAdditions - testAdditions) - (totalDeletions - testDeletions);
|
||||
|
||||
const tooManyLabels = originalLabelCount > MAX_LABELS;
|
||||
const tooManyChanges = nonTestChanges > TOO_BIG_THRESHOLD;
|
||||
|
||||
let message = `${TOO_BIG_MARKER}\n### 📦 Pull Request Size\n\n`;
|
||||
|
||||
if (tooManyLabels && tooManyChanges) {
|
||||
message += `This PR is too large with ${nonTestChanges} line changes (excluding tests) and affects ${originalLabelCount} different components/areas.`;
|
||||
} else if (tooManyLabels) {
|
||||
      message += `This PR affects ${originalLabelCount} different components/areas.`;
    } else {
      message += `This PR is too large with ${nonTestChanges} line changes (excluding tests).`;
    }

    message += ` Please consider breaking it down into smaller, focused PRs to make review easier and reduce the risk of conflicts.\n\n`;
    message += `For guidance on breaking down large PRs, see: https://developers.esphome.io/contributing/submitting-your-work/#how-to-approach-large-submissions`;

    messages.push(message);
  }

  // CODEOWNERS message
  if (finalLabels.includes('needs-codeowners')) {
    const message = `${CODEOWNERS_MARKER}\n### 👥 Code Ownership\n\n` +
      `Hey there @${prAuthor},\n` +
      `Thanks for submitting this pull request! Can you add yourself as a codeowner for this integration? ` +
      `This way we can notify you if a bug report for this integration is reported.\n\n` +
      `In \`__init__.py\` of the integration, please add:\n\n` +
      `\`\`\`python\nCODEOWNERS = ["@${prAuthor}"]\n\`\`\`\n\n` +
      `And run \`script/build_codeowners.py\``;

    messages.push(message);
  }

  return messages;
}

// Handle reviews
async function handleReviews(finalLabels, originalLabelCount) {
  const reviewMessages = generateReviewMessages(finalLabels, originalLabelCount);
  const hasReviewableLabels = finalLabels.some(label =>
    ['too-big', 'needs-codeowners'].includes(label)
  );

  const { data: reviews } = await github.rest.pulls.listReviews({
    owner,
    repo,
    pull_number: pr_number
  });

  const botReviews = reviews.filter(review =>
    review.user.type === 'Bot' &&
    review.state === 'CHANGES_REQUESTED' &&
    review.body && review.body.includes(BOT_COMMENT_MARKER)
  );

  if (hasReviewableLabels) {
    const reviewBody = `${BOT_COMMENT_MARKER}\n\n${reviewMessages.join('\n\n---\n\n')}`;

    if (botReviews.length > 0) {
      // Update existing review
      await github.rest.pulls.updateReview({
        owner,
        repo,
        pull_number: pr_number,
        review_id: botReviews[0].id,
        body: reviewBody
      });
      console.log('Updated existing bot review');
    } else {
      // Create new review
      await github.rest.pulls.createReview({
        owner,
        repo,
        pull_number: pr_number,
        body: reviewBody,
        event: 'REQUEST_CHANGES'
      });
      console.log('Created new bot review');
    }
  } else if (botReviews.length > 0) {
    // Dismiss existing reviews
    for (const review of botReviews) {
      try {
        await github.rest.pulls.dismissReview({
          owner,
          repo,
          pull_number: pr_number,
          review_id: review.id,
          message: 'Review dismissed: All requirements have been met'
        });
        console.log(`Dismissed bot review ${review.id}`);
      } catch (error) {
        console.log(`Failed to dismiss review ${review.id}:`, error.message);
      }
    }
  }
}

// Main execution
const apiData = await fetchApiData();
const baseRef = context.payload.pull_request.base.ref;

// Early exit for release and beta branches only
if (baseRef === 'release' || baseRef === 'beta') {
  const branchLabels = await detectMergeBranch();
  const finalLabels = Array.from(branchLabels);

  console.log('Computed labels (merge branch only):', finalLabels.join(', '));

  // Apply labels
  if (finalLabels.length > 0) {
    await github.rest.issues.addLabels({
      owner,
      repo,
      issue_number: pr_number,
      labels: finalLabels
    });
  }

  // Remove old managed labels
  const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
  for (const label of labelsToRemove) {
    try {
      await github.rest.issues.removeLabel({
        owner,
        repo,
        issue_number: pr_number,
        name: label
      });
    } catch (error) {
      console.log(`Failed to remove label ${label}:`, error.message);
    }
  }

  return;
}

// Run all strategies
const [
  branchLabels,
  componentLabels,
  newComponentLabels,
  newPlatformLabels,
  coreLabels,
  sizeLabels,
  dashboardLabels,
  actionsLabels,
  codeOwnerLabels,
  testLabels,
  checkboxLabels
] = await Promise.all([
  detectMergeBranch(),
  detectComponentPlatforms(apiData),
  detectNewComponents(),
  detectNewPlatforms(apiData),
  detectCoreChanges(),
  detectPRSize(),
  detectDashboardChanges(),
  detectGitHubActionsChanges(),
  detectCodeOwner(),
  detectTests(),
  detectPRTemplateCheckboxes()
]);

// Combine all labels
const allLabels = new Set([
  ...branchLabels,
  ...componentLabels,
  ...newComponentLabels,
  ...newPlatformLabels,
  ...coreLabels,
  ...sizeLabels,
  ...dashboardLabels,
  ...actionsLabels,
  ...codeOwnerLabels,
  ...testLabels,
  ...checkboxLabels
]);

// Detect requirements based on all other labels
const requirementLabels = await detectRequirements(allLabels);
for (const label of requirementLabels) {
  allLabels.add(label);
}

let finalLabels = Array.from(allLabels);

// For mega-PRs, exclude component labels if there are too many
if (isMegaPR) {
  const componentLabels = finalLabels.filter(label => label.startsWith('component: '));
  if (componentLabels.length > COMPONENT_LABEL_THRESHOLD) {
    finalLabels = finalLabels.filter(label => !label.startsWith('component: '));
    console.log(`Mega-PR detected - excluding ${componentLabels.length} component labels (threshold: ${COMPONENT_LABEL_THRESHOLD})`);
  }
}

// Handle too many labels (only for non-mega PRs)
const tooManyLabels = finalLabels.length > MAX_LABELS;
const originalLabelCount = finalLabels.length;

if (tooManyLabels && !isMegaPR && !finalLabels.includes('too-big')) {
  finalLabels = ['too-big'];
}

console.log('Computed labels:', finalLabels.join(', '));

// Handle reviews
await handleReviews(finalLabels, originalLabelCount);

// Apply labels
if (finalLabels.length > 0) {
  console.log(`Adding labels: ${finalLabels.join(', ')}`);
  await github.rest.issues.addLabels({
    owner,
    repo,
    issue_number: pr_number,
    labels: finalLabels
  });
}

// Remove old managed labels
const labelsToRemove = managedLabels.filter(label => !finalLabels.includes(label));
for (const label of labelsToRemove) {
  console.log(`Removing label: ${label}`);
  try {
    await github.rest.issues.removeLabel({
      owner,
      repo,
      issue_number: pr_number,
      name: label
    });
  } catch (error) {
    console.log(`Failed to remove label ${label}:`, error.message);
  }
}

4
.github/workflows/ci-api-proto.yml
vendored
@@ -21,9 +21,9 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Set up Python
        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
        uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.11"

4
.github/workflows/ci-clang-tidy-hash.yml
vendored
@@ -21,10 +21,10 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1

      - name: Set up Python
        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
        uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.11"

4
.github/workflows/ci-docker.yml
vendored
@@ -43,9 +43,9 @@ jobs:
          - "docker"
          # - "lint"
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Set up Python
        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
        uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.11"
      - name: Set up Docker Buildx

@@ -49,7 +49,7 @@ jobs:

      - name: Check out code from base repository
        if: steps.pr.outputs.skip != 'true'
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          # Always check out from the base repository (esphome/esphome), never from forks
          # Use the PR's target branch to ensure we run trusted code from the main repo

64
.github/workflows/ci.yml
vendored
@@ -36,18 +36,18 @@ jobs:
|
||||
cache-key: ${{ steps.cache-key.outputs.key }}
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Generate cache-key
|
||||
id: cache-key
|
||||
run: echo key="${{ hashFiles('requirements.txt', 'requirements_test.txt', '.pre-commit-config.yaml') }}" >> $GITHUB_OUTPUT
|
||||
- name: Set up Python ${{ env.DEFAULT_PYTHON }}
|
||||
id: python
|
||||
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
|
||||
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
- name: Restore Python virtual environment
|
||||
id: cache-venv
|
||||
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: venv
|
||||
# yamllint disable-line rule:line-length
|
||||
@@ -70,7 +70,7 @@ jobs:
|
||||
if: needs.determine-jobs.outputs.python-linters == 'true'
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
@@ -91,7 +91,7 @@ jobs:
|
||||
- common
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
@@ -132,7 +132,7 @@ jobs:
|
||||
- common
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Restore Python
|
||||
id: restore-python
|
||||
uses: ./.github/actions/restore-python
|
||||
@@ -157,7 +157,7 @@ jobs:
|
||||
token: ${{ secrets.CODECOV_TOKEN }}
|
||||
- name: Save Python virtual environment cache
|
||||
if: github.ref == 'refs/heads/dev'
|
||||
uses: actions/cache/save@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/save@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: venv
|
||||
key: ${{ runner.os }}-${{ steps.restore-python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
|
||||
@@ -183,7 +183,7 @@ jobs:
|
||||
component-test-batches: ${{ steps.determine.outputs.component-test-batches }}
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
with:
|
||||
# Fetch enough history to find the merge base
|
||||
fetch-depth: 2
|
||||
@@ -193,7 +193,7 @@ jobs:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Restore components graph cache
|
||||
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: .temp/components_graph.json
|
||||
key: components-graph-${{ hashFiles('esphome/components/**/*.py') }}
|
||||
@@ -223,7 +223,7 @@ jobs:
|
||||
echo "component-test-batches=$(echo "$output" | jq -c '.component_test_batches')" >> $GITHUB_OUTPUT
|
||||
- name: Save components graph cache
|
||||
if: github.ref == 'refs/heads/dev'
|
||||
uses: actions/cache/save@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/save@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: .temp/components_graph.json
|
||||
key: components-graph-${{ hashFiles('esphome/components/**/*.py') }}
|
||||
@@ -237,15 +237,15 @@ jobs:
|
||||
if: needs.determine-jobs.outputs.integration-tests == 'true'
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Set up Python 3.13
|
||||
id: python
|
||||
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
|
||||
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
|
||||
with:
|
||||
python-version: "3.13"
|
||||
- name: Restore Python virtual environment
|
||||
id: cache-venv
|
||||
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: venv
|
||||
key: ${{ runner.os }}-${{ steps.python.outputs.python-version }}-venv-${{ needs.common.outputs.cache-key }}
|
||||
@@ -273,7 +273,7 @@ jobs:
|
||||
if: github.event_name == 'pull_request' && (needs.determine-jobs.outputs.cpp-unit-tests-run-all == 'true' || needs.determine-jobs.outputs.cpp-unit-tests-components != '[]')
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
@@ -321,7 +321,7 @@ jobs:
|
||||
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
with:
|
||||
# Need history for HEAD~1 to work for checking changed files
|
||||
fetch-depth: 2
|
||||
@@ -334,14 +334,14 @@ jobs:
|
||||
|
||||
- name: Cache platformio
|
||||
if: github.ref == 'refs/heads/dev'
|
||||
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
|
||||
|
||||
- name: Cache platformio
|
||||
if: github.ref != 'refs/heads/dev'
|
||||
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-${{ matrix.pio_cache_key }}-${{ hashFiles('platformio.ini') }}
|
||||
@@ -400,7 +400,7 @@ jobs:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
with:
|
||||
# Need history for HEAD~1 to work for checking changed files
|
||||
fetch-depth: 2
|
||||
@@ -413,14 +413,14 @@ jobs:
|
||||
|
||||
- name: Cache platformio
|
||||
if: github.ref == 'refs/heads/dev'
|
||||
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-tidyesp32-${{ hashFiles('platformio.ini') }}
|
||||
|
||||
- name: Cache platformio
|
||||
if: github.ref != 'refs/heads/dev'
|
||||
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-tidyesp32-${{ hashFiles('platformio.ini') }}
|
||||
@@ -489,7 +489,7 @@ jobs:
|
||||
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
with:
|
||||
# Need history for HEAD~1 to work for checking changed files
|
||||
fetch-depth: 2
|
||||
@@ -502,14 +502,14 @@ jobs:
|
||||
|
||||
- name: Cache platformio
|
||||
if: github.ref == 'refs/heads/dev'
|
||||
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-tidyesp32-${{ hashFiles('platformio.ini') }}
|
||||
|
||||
- name: Cache platformio
|
||||
if: github.ref != 'refs/heads/dev'
|
||||
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-tidyesp32-${{ hashFiles('platformio.ini') }}
|
||||
@@ -577,7 +577,7 @@ jobs:
|
||||
version: 1.0
|
||||
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
@@ -662,7 +662,7 @@ jobs:
|
||||
if: github.event_name == 'pull_request' && !startsWith(github.base_ref, 'beta') && !startsWith(github.base_ref, 'release')
|
||||
steps:
|
||||
- name: Check out code from GitHub
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
@@ -688,7 +688,7 @@ jobs:
|
||||
skip: ${{ steps.check-script.outputs.skip }}
|
||||
steps:
|
||||
- name: Check out target branch
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
with:
|
||||
ref: ${{ github.base_ref }}
|
||||
|
||||
@@ -735,7 +735,7 @@ jobs:
|
||||
- name: Restore cached memory analysis
|
||||
id: cache-memory-analysis
|
||||
if: steps.check-script.outputs.skip != 'true'
|
||||
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: memory-analysis-target.json
|
||||
key: ${{ steps.cache-key.outputs.cache-key }}
|
||||
@@ -759,7 +759,7 @@ jobs:
|
||||
|
||||
- name: Cache platformio
|
||||
if: steps.check-script.outputs.skip != 'true' && steps.cache-memory-analysis.outputs.cache-hit != 'true'
|
||||
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-memory-${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}-${{ hashFiles('platformio.ini') }}
|
||||
@@ -800,7 +800,7 @@ jobs:
|
||||
|
||||
- name: Save memory analysis to cache
|
||||
if: steps.check-script.outputs.skip != 'true' && steps.cache-memory-analysis.outputs.cache-hit != 'true' && steps.build.outcome == 'success'
|
||||
uses: actions/cache/save@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/save@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: memory-analysis-target.json
|
||||
key: ${{ steps.cache-key.outputs.cache-key }}
|
||||
@@ -840,14 +840,14 @@ jobs:
|
||||
flash_usage: ${{ steps.extract.outputs.flash_usage }}
|
||||
steps:
|
||||
- name: Check out PR branch
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
python-version: ${{ env.DEFAULT_PYTHON }}
|
||||
cache-key: ${{ needs.common.outputs.cache-key }}
|
||||
- name: Cache platformio
|
||||
uses: actions/cache/restore@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
|
||||
uses: actions/cache/restore@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
|
||||
with:
|
||||
path: ~/.platformio
|
||||
key: platformio-memory-${{ fromJSON(needs.determine-jobs.outputs.memory_impact).platform }}-${{ hashFiles('platformio.ini') }}
|
||||
@@ -908,7 +908,7 @@ jobs:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
steps:
|
||||
- name: Check out code
|
||||
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- name: Restore Python
|
||||
uses: ./.github/actions/restore-python
|
||||
with:
|
||||
|
||||
6
.github/workflows/codeql.yml
vendored
@@ -54,11 +54,11 @@ jobs:
    # your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
    steps:
      - name: Checkout repository
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1

      # Initializes the CodeQL tools for scanning.
      - name: Initialize CodeQL
        uses: github/codeql-action/init@45cbd0c69e560cd9e7cd7f8c32362050c9b7ded2 # v4.32.2
        uses: github/codeql-action/init@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
        with:
          languages: ${{ matrix.language }}
          build-mode: ${{ matrix.build-mode }}

@@ -86,6 +86,6 @@ jobs:
          exit 1

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@45cbd0c69e560cd9e7cd7f8c32362050c9b7ded2 # v4.32.2
        uses: github/codeql-action/analyze@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
        with:
          category: "/language:${{matrix.language}}"

20
.github/workflows/release.yml
vendored
@@ -20,7 +20,7 @@ jobs:
      branch_build: ${{ steps.tag.outputs.branch_build }}
      deploy_env: ${{ steps.tag.outputs.deploy_env }}
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Get tag
        id: tag
        # yamllint disable rule:line-length

@@ -60,9 +60,9 @@ jobs:
      contents: read
      id-token: write
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Set up Python
        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
        uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"
      - name: Build

@@ -92,9 +92,9 @@ jobs:
            os: "ubuntu-24.04-arm"

    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Set up Python
        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
        uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.11"

@@ -102,12 +102,12 @@ jobs:
        uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0

      - name: Log in to docker hub
        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        with:
          username: ${{ secrets.DOCKER_USER }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Log in to the GitHub container registry
        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        with:
          registry: ghcr.io
          username: ${{ github.actor }}

@@ -168,7 +168,7 @@ jobs:
          - ghcr
          - dockerhub
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1

      - name: Download digests
        uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0

@@ -182,13 +182,13 @@ jobs:

      - name: Log in to docker hub
        if: matrix.registry == 'dockerhub'
        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        with:
          username: ${{ secrets.DOCKER_USER }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Log in to the GitHub container registry
        if: matrix.registry == 'ghcr'
        uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
        with:
          registry: ghcr.io
          username: ${{ github.actor }}

8
.github/workflows/sync-device-classes.yml
vendored
@@ -13,16 +13,16 @@ jobs:
    if: github.repository == 'esphome/esphome'
    steps:
      - name: Checkout
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1

      - name: Checkout Home Assistant
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          repository: home-assistant/core
          path: lib/home-assistant

      - name: Setup Python
        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
        uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: 3.13

@@ -41,7 +41,7 @@ jobs:
          python script/run-in-env.py pre-commit run --all-files

      - name: Commit changes
        uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8.1.0
        uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
        with:
          commit-message: "Synchronise Device Classes from Home Assistant"
          committer: esphomebot <esphome@openhomefoundation.org>

@@ -11,7 +11,7 @@ ci:
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    # Ruff version.
    rev: v0.15.0
    rev: v0.14.11
    hooks:
      # Run the linter.
      - id: ruff

10
CODEOWNERS
@@ -88,8 +88,7 @@ esphome/components/bmp3xx/* @latonita
esphome/components/bmp3xx_base/* @latonita @martgras
esphome/components/bmp3xx_i2c/* @latonita
esphome/components/bmp3xx_spi/* @latonita
esphome/components/bmp581_base/* @danielkent-net @kahrendt
esphome/components/bmp581_i2c/* @danielkent-net @kahrendt
esphome/components/bmp581/* @kahrendt
esphome/components/bp1658cj/* @Cossid
esphome/components/bp5758d/* @Cossid
esphome/components/bthome_mithermometer/* @nagyrobi

@@ -104,7 +103,6 @@ esphome/components/cc1101/* @gabest11 @lygris
esphome/components/ccs811/* @habbie
esphome/components/cd74hc4067/* @asoehlke
esphome/components/ch422g/* @clydebarrow @jesterret
esphome/components/ch423/* @dwmw2
esphome/components/chsc6x/* @kkosik20
esphome/components/climate/* @esphome/core
esphome/components/climate_ir/* @glmnet

@@ -134,7 +132,6 @@ esphome/components/dfplayer/* @glmnet
esphome/components/dfrobot_sen0395/* @niklasweber
esphome/components/dht/* @OttoWinter
esphome/components/display_menu_base/* @numo68
esphome/components/dlms_meter/* @SimonFischer04
esphome/components/dps310/* @kbx81
esphome/components/ds1307/* @badbadc0ffee
esphome/components/ds2484/* @mrk-its

@@ -252,13 +249,11 @@ esphome/components/ina260/* @mreditor97
esphome/components/ina2xx_base/* @latonita
esphome/components/ina2xx_i2c/* @latonita
esphome/components/ina2xx_spi/* @latonita
esphome/components/infrared/* @kbx81
esphome/components/inkbird_ibsth1_mini/* @fkirill
esphome/components/inkplate/* @jesserockz @JosipKuci
esphome/components/integration/* @OttoWinter
esphome/components/internal_temperature/* @Mat931
esphome/components/interval/* @esphome/core
esphome/components/ir_rf_proxy/* @kbx81
esphome/components/jsn_sr04t/* @Mafus1
esphome/components/json/* @esphome/core
esphome/components/kamstrup_kmp/* @cfeenstra1024

@@ -484,7 +479,6 @@ esphome/components/switch/* @esphome/core
esphome/components/switch/binary_sensor/* @ssieb
esphome/components/sx126x/* @swoboda1337
esphome/components/sx127x/* @swoboda1337
esphome/components/sy6970/* @linkedupbits
esphome/components/syslog/* @clydebarrow
esphome/components/t6615/* @tylermenezes
esphome/components/tc74/* @sethgirvan

@@ -532,7 +526,7 @@ esphome/components/uart/packet_transport/* @clydebarrow
esphome/components/udp/* @clydebarrow
esphome/components/ufire_ec/* @pvizeli
esphome/components/ufire_ise/* @pvizeli
esphome/components/ultrasonic/* @ssieb @swoboda1337
esphome/components/ultrasonic/* @OttoWinter
esphome/components/update/* @jesserockz
esphome/components/uponor_smatrix/* @kroimon
esphome/components/usb_cdc_acm/* @kbx81

2
Doxyfile
@@ -48,7 +48,7 @@ PROJECT_NAME = ESPHome
# could be handy for archiving the generated documentation or if some version
# control system is used.

PROJECT_NUMBER = 2026.2.0-dev
PROJECT_NUMBER = 2026.1.0-dev

# Using the PROJECT_BRIEF tag one can provide an optional one line description
# for a project that appears at the top of each page and should give viewer a

@@ -1,6 +1,5 @@
|
||||
# PYTHON_ARGCOMPLETE_OK
|
||||
import argparse
|
||||
from collections.abc import Callable
|
||||
from datetime import datetime
|
||||
import functools
|
||||
import getpass
|
||||
@@ -35,6 +34,7 @@ from esphome.const import (
|
||||
CONF_MDNS,
|
||||
CONF_MQTT,
|
||||
CONF_NAME,
|
||||
CONF_NAME_ADD_MAC_SUFFIX,
|
||||
CONF_OTA,
|
||||
CONF_PASSWORD,
|
||||
CONF_PLATFORM,
|
||||
@@ -43,7 +43,6 @@ from esphome.const import (
|
||||
CONF_SUBSTITUTIONS,
|
||||
CONF_TOPIC,
|
||||
ENV_NOGITIGNORE,
|
||||
KEY_NATIVE_IDF,
|
||||
PLATFORM_ESP32,
|
||||
PLATFORM_ESP8266,
|
||||
PLATFORM_RP2040,
|
||||
@@ -61,6 +60,7 @@ from esphome.util import (
|
||||
run_external_process,
|
||||
safe_print,
|
||||
)
|
||||
from esphome.zeroconf import discover_mdns_devices
|
||||
|
||||
_LOGGER = logging.getLogger(__name__)
|
||||
|
||||
@@ -117,7 +117,6 @@ class ArgsProtocol(Protocol):
|
||||
configuration: str
|
||||
name: str
|
||||
upload_speed: str | None
|
||||
native_idf: bool
|
||||
|
||||
|
||||
def choose_prompt(options, purpose: str = None):
|
||||
@@ -225,13 +224,8 @@ def choose_upload_log_host(
|
||||
else:
|
||||
resolved.append(device)
|
||||
if not resolved:
|
||||
if CORE.dashboard:
|
||||
hint = "If you know the IP, set 'use_address' in your network config."
|
||||
else:
|
||||
hint = "If you know the IP, try --device <IP>"
|
||||
raise EsphomeError(
|
||||
f"All specified devices {defaults} could not be resolved. "
|
||||
f"Is the device connected to the network? {hint}"
|
||||
f"All specified devices {defaults} could not be resolved. Is the device connected to the network?"
|
||||
)
|
||||
return resolved
|
||||
|
||||
@@ -240,22 +234,34 @@ def choose_upload_log_host(
|
||||
(f"{port.path} ({port.description})", port.path) for port in get_serial_ports()
|
||||
]
|
||||
|
||||
def add_ota_options() -> None:
|
||||
"""Add OTA options, using mDNS discovery if name_add_mac_suffix is enabled."""
|
||||
if has_name_add_mac_suffix() and has_mdns() and has_non_ip_address():
|
||||
# Discover devices via mDNS when name_add_mac_suffix is enabled
|
||||
safe_print("Discovering devices...")
|
||||
discovered = discover_mdns_devices(CORE.name)
|
||||
for device_addr in discovered:
|
||||
options.append((f"Over The Air ({device_addr})", device_addr))
|
||||
if not discovered and has_resolvable_address():
|
||||
# No devices found, show base address as fallback
|
||||
options.append(
|
||||
(f"Over The Air ({CORE.address}) (no devices found)", CORE.address)
|
||||
)
|
||||
elif has_resolvable_address():
|
||||
options.append((f"Over The Air ({CORE.address})", CORE.address))
|
||||
if has_mqtt_ip_lookup():
|
||||
options.append(("Over The Air (MQTT IP lookup)", "MQTTIP"))
|
||||
|
||||
if purpose == Purpose.LOGGING:
|
||||
if has_mqtt_logging():
|
||||
mqtt_config = CORE.config[CONF_MQTT]
|
||||
options.append((f"MQTT ({mqtt_config[CONF_BROKER]})", "MQTT"))
|
||||
|
||||
if has_api():
|
||||
if has_resolvable_address():
|
||||
options.append((f"Over The Air ({CORE.address})", CORE.address))
|
||||
if has_mqtt_ip_lookup():
|
||||
options.append(("Over The Air (MQTT IP lookup)", "MQTTIP"))
|
||||
add_ota_options()
|
||||
|
||||
elif purpose == Purpose.UPLOADING and has_ota():
|
||||
if has_resolvable_address():
|
||||
options.append((f"Over The Air ({CORE.address})", CORE.address))
|
||||
if has_mqtt_ip_lookup():
|
||||
options.append(("Over The Air (MQTT IP lookup)", "MQTTIP"))
|
||||
add_ota_options()
|
||||
|
||||
if check_default is not None and check_default in [opt[1] for opt in options]:
|
||||
return [check_default]
|
||||
@@ -294,13 +300,8 @@ def has_api() -> bool:
|
||||
|
||||
|
||||
def has_ota() -> bool:
|
||||
"""Check if OTA upload is available (requires platform: esphome)."""
|
||||
if CONF_OTA not in CORE.config:
|
||||
return False
|
||||
return any(
|
||||
ota_item.get(CONF_PLATFORM) == CONF_ESPHOME
|
||||
for ota_item in CORE.config[CONF_OTA]
|
||||
)
|
||||
"""Check if OTA is available."""
|
||||
return CONF_OTA in CORE.config
|
||||
|
||||
|
||||
def has_mqtt_ip_lookup() -> bool:
|
||||
@@ -347,6 +348,14 @@ def has_resolvable_address() -> bool:
|
||||
return not CORE.address.endswith(".local")
|
||||
|
||||
|
||||
def has_name_add_mac_suffix() -> bool:
|
||||
"""Check if name_add_mac_suffix is enabled in the config."""
|
||||
if CORE.config is None:
|
||||
return False
|
||||
esphome_config = CORE.config.get(CONF_ESPHOME, {})
|
||||
return esphome_config.get(CONF_NAME_ADD_MAC_SUFFIX, False)
|
||||
|
||||
|
||||
def mqtt_get_ip(config: ConfigType, username: str, password: str, client_id: str):
|
||||
from esphome import mqtt
|
||||
|
||||
@@ -507,15 +516,12 @@ def wrap_to_code(name, comp):
|
||||
return wrapped
|
||||
|
||||
|
||||
def write_cpp(config: ConfigType, native_idf: bool = False) -> int:
|
||||
def write_cpp(config: ConfigType) -> int:
|
||||
if not get_bool_env(ENV_NOGITIGNORE):
|
||||
writer.write_gitignore()
|
||||
|
||||
# Store native_idf flag so esp32 component can check it
|
||||
CORE.data[KEY_NATIVE_IDF] = native_idf
|
||||
|
||||
generate_cpp_contents(config)
|
||||
return write_cpp_file(native_idf=native_idf)
|
||||
return write_cpp_file()
|
||||
|
||||
|
||||
def generate_cpp_contents(config: ConfigType) -> None:
|
||||
@@ -529,54 +535,32 @@ def generate_cpp_contents(config: ConfigType) -> None:
|
||||
CORE.flush_tasks()
|
||||
|
||||
|
||||
def write_cpp_file(native_idf: bool = False) -> int:
|
||||
def write_cpp_file() -> int:
|
||||
code_s = indent(CORE.cpp_main_section)
|
||||
writer.write_cpp(code_s)
|
||||
|
||||
if native_idf and CORE.is_esp32 and CORE.target_framework == "esp-idf":
|
||||
from esphome.build_gen import espidf
|
||||
from esphome.build_gen import platformio
|
||||
|
||||
espidf.write_project()
|
||||
else:
|
||||
from esphome.build_gen import platformio
|
||||
|
||||
platformio.write_project()
|
||||
platformio.write_project()
|
||||
|
||||
return 0
|
||||
|
||||
|
||||
def compile_program(args: ArgsProtocol, config: ConfigType) -> int:
|
||||
native_idf = getattr(args, "native_idf", False)
|
||||
from esphome import platformio_api
|
||||
|
||||
# NOTE: "Build path:" format is parsed by script/ci_memory_impact_extract.py
|
||||
# If you change this format, update the regex in that script as well
|
||||
_LOGGER.info("Compiling app... Build path: %s", CORE.build_path)
|
||||
|
||||
if native_idf and CORE.is_esp32 and CORE.target_framework == "esp-idf":
|
||||
from esphome import espidf_api
|
||||
|
||||
rc = espidf_api.run_compile(config, CORE.verbose)
|
||||
if rc != 0:
|
||||
return rc
|
||||
|
||||
# Create factory.bin and ota.bin
|
||||
espidf_api.create_factory_bin()
|
||||
espidf_api.create_ota_bin()
|
||||
else:
|
||||
from esphome import platformio_api
|
||||
|
||||
rc = platformio_api.run_compile(config, CORE.verbose)
|
||||
if rc != 0:
|
||||
return rc
|
||||
|
||||
idedata = platformio_api.get_idedata(config)
|
||||
if idedata is None:
|
||||
return 1
|
||||
rc = platformio_api.run_compile(config, CORE.verbose)
|
||||
if rc != 0:
|
||||
return rc
|
||||
|
||||
# Check if firmware was rebuilt and emit build_info + create manifest
|
||||
_check_and_emit_build_info()
|
||||
|
||||
return 0
|
||||
idedata = platformio_api.get_idedata(config)
|
||||
return 0 if idedata is not None else 1
|
||||
|
||||
|
||||
def _check_and_emit_build_info() -> None:
|
||||
@@ -833,8 +817,7 @@ def command_vscode(args: ArgsProtocol) -> int | None:
|
||||
|
||||
|
||||
def command_compile(args: ArgsProtocol, config: ConfigType) -> int | None:
|
||||
native_idf = getattr(args, "native_idf", False)
|
||||
exit_code = write_cpp(config, native_idf=native_idf)
|
||||
exit_code = write_cpp(config)
|
||||
if exit_code != 0:
|
||||
return exit_code
|
||||
if args.only_generate:
|
||||
@@ -889,8 +872,7 @@ def command_logs(args: ArgsProtocol, config: ConfigType) -> int | None:
|
||||
|
||||
|
||||
def command_run(args: ArgsProtocol, config: ConfigType) -> int | None:
|
||||
native_idf = getattr(args, "native_idf", False)
|
||||
exit_code = write_cpp(config, native_idf=native_idf)
|
||||
exit_code = write_cpp(config)
|
||||
if exit_code != 0:
|
||||
return exit_code
|
||||
exit_code = compile_program(args, config)
|
||||
@@ -971,21 +953,11 @@ def command_dashboard(args: ArgsProtocol) -> int | None:
|
||||
return dashboard.start_dashboard(args)
|
||||
|
||||
|
||||
def run_multiple_configs(
|
||||
files: list, command_builder: Callable[[str], list[str]]
|
||||
) -> int:
|
||||
"""Run a command for each configuration file in a subprocess.
|
||||
|
||||
Args:
|
||||
files: List of configuration files to process.
|
||||
command_builder: Callable that takes a file path and returns a command list.
|
||||
|
||||
Returns:
|
||||
Number of failed files.
|
||||
"""
|
||||
def command_update_all(args: ArgsProtocol) -> int | None:
|
||||
import click
|
||||
|
||||
success = {}
|
||||
files = list_yaml_files(args.configuration)
|
||||
twidth = 60
|
||||
|
||||
def print_bar(middle_text):
|
||||
@@ -995,19 +967,17 @@ def run_multiple_configs(
|
||||
safe_print(f"{half_line}{middle_text}{half_line}")
|
||||
|
||||
for f in files:
|
||||
f_path = Path(f) if not isinstance(f, Path) else f
|
||||
|
||||
if any(f_path.name == x for x in SECRETS_FILES):
|
||||
_LOGGER.warning("Skipping secrets file %s", f_path)
|
||||
continue
|
||||
|
||||
safe_print(f"Processing {color(AnsiFore.CYAN, str(f))}")
|
||||
safe_print(f"Updating {color(AnsiFore.CYAN, str(f))}")
|
||||
safe_print("-" * twidth)
|
||||
safe_print()
|
||||
|
||||
cmd = command_builder(f)
|
||||
rc = run_external_process(*cmd)
|
||||
|
||||
if CORE.dashboard:
|
||||
rc = run_external_process(
|
||||
"esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"
|
||||
)
|
||||
else:
|
||||
rc = run_external_process(
|
||||
"esphome", "run", f, "--no-logs", "--device", "OTA"
|
||||
)
|
||||
if rc == 0:
|
||||
print_bar(f"[{color(AnsiFore.BOLD_GREEN, 'SUCCESS')}] {str(f)}")
|
||||
success[f] = True
|
||||
@@ -1022,8 +992,6 @@ def run_multiple_configs(
|
||||
print_bar(f"[{color(AnsiFore.BOLD_WHITE, 'SUMMARY')}]")
|
||||
failed = 0
|
||||
for f in files:
|
||||
if f not in success:
|
||||
continue # Skipped file
|
||||
if success[f]:
|
||||
safe_print(f" - {str(f)}: {color(AnsiFore.GREEN, 'SUCCESS')}")
|
||||
else:
|
||||
@@ -1032,17 +1000,6 @@ def run_multiple_configs(
|
||||
return failed
|
||||
|
||||
|
||||
def command_update_all(args: ArgsProtocol) -> int | None:
|
||||
files = list_yaml_files(args.configuration)
|
||||
|
||||
def build_command(f):
|
||||
if CORE.dashboard:
|
||||
return ["esphome", "--dashboard", "run", f, "--no-logs", "--device", "OTA"]
|
||||
return ["esphome", "run", f, "--no-logs", "--device", "OTA"]
|
||||
|
||||
return run_multiple_configs(files, build_command)
|
||||
|
||||
|
||||
def command_idedata(args: ArgsProtocol, config: ConfigType) -> int:
|
||||
import json
|
||||
|
||||
@@ -1344,11 +1301,6 @@ def parse_args(argv):
|
||||
help="Only generate source code, do not compile.",
|
||||
action="store_true",
|
||||
)
|
||||
parser_compile.add_argument(
|
||||
"--native-idf",
|
||||
help="Build with native ESP-IDF instead of PlatformIO (ESP32 esp-idf framework only).",
|
||||
action="store_true",
|
||||
)
|
||||
|
||||
parser_upload = subparsers.add_parser(
|
||||
"upload",
|
||||
@@ -1430,11 +1382,6 @@ def parse_args(argv):
|
||||
help="Reset the device before starting serial logs.",
|
||||
default=os.getenv("ESPHOME_SERIAL_LOGGING_RESET"),
|
||||
)
|
||||
parser_run.add_argument(
|
||||
"--native-idf",
|
||||
help="Build with native ESP-IDF instead of PlatformIO (ESP32 esp-idf framework only).",
|
||||
action="store_true",
|
||||
)
|
||||
|
||||
parser_clean = subparsers.add_parser(
|
||||
"clean-mqtt",
|
||||
@@ -1603,48 +1550,38 @@ def run_esphome(argv):
|
||||
|
||||
_LOGGER.info("ESPHome %s", const.__version__)
|
||||
|
||||
# Multiple configurations: use subprocesses to avoid state leakage
|
||||
# between compilations (e.g., LVGL touchscreen state in module globals)
|
||||
if len(args.configuration) > 1:
|
||||
# Build command by reusing argv, replacing all configs with single file
|
||||
# argv[0] is the program path, skip it since we prefix with "esphome"
|
||||
def build_command(f):
|
||||
return (
|
||||
["esphome"]
|
||||
+ [arg for arg in argv[1:] if arg not in args.configuration]
|
||||
+ [str(f)]
|
||||
)
|
||||
for conf_path in args.configuration:
|
||||
conf_path = Path(conf_path)
|
||||
if any(conf_path.name == x for x in SECRETS_FILES):
|
||||
_LOGGER.warning("Skipping secrets file %s", conf_path)
|
||||
continue
|
||||
|
||||
return run_multiple_configs(args.configuration, build_command)
|
||||
CORE.config_path = conf_path
|
||||
CORE.dashboard = args.dashboard
|
||||
|
||||
# Single configuration
|
||||
conf_path = Path(args.configuration[0])
|
||||
if any(conf_path.name == x for x in SECRETS_FILES):
|
||||
_LOGGER.warning("Skipping secrets file %s", conf_path)
|
||||
return 0
|
||||
# For logs command, skip updating external components
|
||||
skip_external = args.command == "logs"
|
||||
config = read_config(
|
||||
dict(args.substitution) if args.substitution else {},
|
||||
skip_external_update=skip_external,
|
||||
)
|
||||
if config is None:
|
||||
return 2
|
||||
CORE.config = config
|
||||
|
||||
CORE.config_path = conf_path
|
||||
CORE.dashboard = args.dashboard
|
||||
if args.command not in POST_CONFIG_ACTIONS:
|
||||
safe_print(f"Unknown command {args.command}")
|
||||
|
||||
# For logs command, skip updating external components
|
||||
skip_external = args.command == "logs"
|
||||
config = read_config(
|
||||
dict(args.substitution) if args.substitution else {},
|
||||
skip_external_update=skip_external,
|
||||
)
|
||||
if config is None:
|
||||
return 2
|
||||
CORE.config = config
|
||||
try:
|
||||
rc = POST_CONFIG_ACTIONS[args.command](args, config)
|
||||
except EsphomeError as e:
|
||||
_LOGGER.error(e, exc_info=args.verbose)
|
||||
return 1
|
||||
if rc != 0:
|
||||
return rc
|
||||
|
||||
if args.command not in POST_CONFIG_ACTIONS:
|
||||
safe_print(f"Unknown command {args.command}")
|
||||
return 1
|
||||
|
||||
try:
|
||||
return POST_CONFIG_ACTIONS[args.command](args, config)
|
||||
except EsphomeError as e:
|
||||
_LOGGER.error(e, exc_info=args.verbose)
|
||||
return 1
|
||||
CORE.reset()
|
||||
return 0
|
||||
|
||||
|
||||
def main():
|
||||
|
||||
@@ -12,6 +12,7 @@ from .const import (
|
||||
CORE_SUBCATEGORY_PATTERNS,
|
||||
DEMANGLED_PATTERNS,
|
||||
ESPHOME_COMPONENT_PATTERN,
|
||||
SECTION_TO_ATTR,
|
||||
SYMBOL_PATTERNS,
|
||||
)
|
||||
from .demangle import batch_demangle
|
||||
@@ -21,7 +22,7 @@ from .helpers import (
|
||||
map_section_name,
|
||||
parse_symbol_line,
|
||||
)
|
||||
from .toolchain import find_tool, resolve_tool_path, run_tool
|
||||
from .toolchain import find_tool, run_tool
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from esphome.platformio_api import IDEData
|
||||
@@ -90,17 +91,6 @@ class ComponentMemory:
|
||||
bss_size: int = 0 # Uninitialized data (ram only)
|
||||
symbol_count: int = 0
|
||||
|
||||
def add_section_size(self, section_name: str, size: int) -> None:
|
||||
"""Add size to the appropriate attribute for a section."""
|
||||
if section_name == ".text":
|
||||
self.text_size += size
|
||||
elif section_name == ".rodata":
|
||||
self.rodata_size += size
|
||||
elif section_name == ".data":
|
||||
self.data_size += size
|
||||
elif section_name == ".bss":
|
||||
self.bss_size += size
|
||||
|
||||
@property
|
||||
def flash_total(self) -> int:
|
||||
"""Total flash usage (text + rodata + data)."""
|
||||
@@ -142,12 +132,6 @@ class MemoryAnalyzer:
|
||||
readelf_path = readelf_path or idedata.readelf_path
|
||||
_LOGGER.debug("Using toolchain paths from PlatformIO idedata")
|
||||
|
||||
# Validate paths exist, fall back to find_tool if they don't
|
||||
# This handles cases like Zephyr where cc_path doesn't include full path
|
||||
# and the toolchain prefix may differ (e.g., arm-zephyr-eabi- vs arm-none-eabi-)
|
||||
objdump_path = resolve_tool_path("objdump", objdump_path, objdump_path)
|
||||
readelf_path = resolve_tool_path("readelf", readelf_path, objdump_path)
|
||||
|
||||
self.objdump_path = objdump_path or "objdump"
|
||||
self.readelf_path = readelf_path or "readelf"
|
||||
self.external_components = external_components or set()
|
||||
@@ -177,15 +161,12 @@ class MemoryAnalyzer:
|
||||
self._elf_symbol_names: set[str] = set()
|
||||
# SDK symbols not in ELF (static/local symbols from closed-source libs)
|
||||
self._sdk_symbols: list[SDKSymbol] = []
|
||||
# CSWTCH symbols: list of (name, size, source_file, component)
|
||||
self._cswtch_symbols: list[tuple[str, int, str, str]] = []
|
||||
|
||||
def analyze(self) -> dict[str, ComponentMemory]:
|
||||
"""Analyze the ELF file and return component memory usage."""
|
||||
self._parse_sections()
|
||||
self._parse_symbols()
|
||||
self._categorize_symbols()
|
||||
self._analyze_cswtch_symbols()
|
||||
self._analyze_sdk_libraries()
|
||||
return dict(self.components)
|
||||
|
||||
@@ -268,7 +249,8 @@ class MemoryAnalyzer:
|
||||
comp_mem.symbol_count += 1
|
||||
|
||||
# Update the appropriate size attribute based on section
|
||||
comp_mem.add_section_size(section_name, size)
|
||||
if attr_name := SECTION_TO_ATTR.get(section_name):
|
||||
setattr(comp_mem, attr_name, getattr(comp_mem, attr_name) + size)
|
||||
|
||||
# Track uncategorized symbols
|
||||
if component == "other" and size > 0:
|
||||
@@ -384,277 +366,6 @@ class MemoryAnalyzer:
|
||||
|
||||
return "Other Core"
|
||||
|
||||
def _find_object_files_dir(self) -> Path | None:
|
||||
"""Find the directory containing object files for this build.
|
||||
|
||||
Returns:
|
||||
Path to the directory containing .o files, or None if not found.
|
||||
"""
|
||||
# The ELF is typically at .pioenvs/<env>/firmware.elf
|
||||
# Object files are in .pioenvs/<env>/src/ and .pioenvs/<env>/lib*/
|
||||
pioenvs_dir = self.elf_path.parent
|
||||
if pioenvs_dir.exists() and any(pioenvs_dir.glob("src/*.o")):
|
||||
return pioenvs_dir
|
||||
return None
|
||||
|
||||
@staticmethod
|
||||
def _parse_nm_cswtch_output(
|
||||
output: str,
|
||||
base_dir: Path | None,
|
||||
cswtch_map: dict[str, list[tuple[str, int]]],
|
||||
) -> None:
|
||||
"""Parse nm output for CSWTCH symbols and add to cswtch_map.
|
||||
|
||||
Handles both ``.o`` files and ``.a`` archives.
|
||||
|
||||
nm output formats::
|
||||
|
||||
.o files: /path/file.o:hex_addr hex_size type name
|
||||
.a files: /path/lib.a:member.o:hex_addr hex_size type name
|
||||
|
||||
For ``.o`` files, paths are made relative to *base_dir* when possible.
|
||||
For ``.a`` archives (detected by ``:`` in the file portion), paths are
|
||||
formatted as ``archive_stem/member.o`` (e.g. ``liblwip2-536-feat/lwip-esp.o``).
|
||||
|
||||
Args:
|
||||
output: Raw stdout from ``nm --print-file-name -S``.
|
||||
base_dir: Base directory for computing relative paths of ``.o`` files.
|
||||
Pass ``None`` when scanning archives outside the build tree.
|
||||
cswtch_map: Dict to populate, mapping ``"CSWTCH$N:size"`` to source list.
|
||||
"""
|
||||
for line in output.splitlines():
|
||||
if "CSWTCH$" not in line:
|
||||
continue
|
||||
|
||||
# Split on last ":" that precedes a hex address.
|
||||
# For .o: "filepath.o" : "hex_addr hex_size type name"
|
||||
# For .a: "filepath.a:member.o" : "hex_addr hex_size type name"
|
||||
parts_after_colon = line.rsplit(":", 1)
|
||||
if len(parts_after_colon) != 2:
|
||||
continue
|
||||
|
||||
file_path = parts_after_colon[0]
|
||||
fields = parts_after_colon[1].split()
|
||||
# fields: [address, size, type, name]
|
||||
if len(fields) < 4:
|
||||
continue
|
||||
|
||||
sym_name = fields[3]
|
||||
if not sym_name.startswith("CSWTCH$"):
|
||||
continue
|
||||
|
||||
try:
|
||||
size = int(fields[1], 16)
|
||||
except ValueError:
|
||||
continue
|
||||
|
||||
# Determine readable source path
|
||||
# Use ".a:" to detect archive format (not bare ":" which matches
|
||||
# Windows drive letters like "C:\...\file.o").
|
||||
if ".a:" in file_path:
|
||||
# Archive format: "archive.a:member.o" → "archive_stem/member.o"
|
||||
archive_part, member = file_path.rsplit(":", 1)
|
||||
archive_name = Path(archive_part).stem
|
||||
rel_path = f"{archive_name}/{member}"
|
||||
elif base_dir is not None:
|
||||
try:
|
||||
rel_path = str(Path(file_path).relative_to(base_dir))
|
||||
except ValueError:
|
||||
rel_path = file_path
|
||||
else:
|
||||
rel_path = file_path
|
||||
|
||||
key = f"{sym_name}:{size}"
|
||||
cswtch_map[key].append((rel_path, size))
|
||||
|
||||
def _run_nm_cswtch_scan(
|
||||
self,
|
||||
files: list[Path],
|
||||
base_dir: Path | None,
|
||||
cswtch_map: dict[str, list[tuple[str, int]]],
|
||||
) -> None:
|
||||
"""Run nm on *files* and add any CSWTCH symbols to *cswtch_map*.
|
||||
|
||||
Args:
|
||||
files: Object (``.o``) or archive (``.a``) files to scan.
|
||||
base_dir: Base directory for relative path computation (see
|
||||
:meth:`_parse_nm_cswtch_output`).
|
||||
cswtch_map: Dict to populate with results.
|
||||
"""
|
||||
if not self.nm_path or not files:
|
||||
return
|
||||
|
||||
_LOGGER.debug("Scanning %d files for CSWTCH symbols", len(files))
|
||||
|
||||
result = run_tool(
|
||||
[self.nm_path, "--print-file-name", "-S"] + [str(f) for f in files],
|
||||
timeout=30,
|
||||
)
|
||||
if result is None or result.returncode != 0:
|
||||
_LOGGER.debug(
|
||||
"nm failed or timed out scanning %d files for CSWTCH symbols",
|
||||
len(files),
|
||||
)
|
||||
return
|
||||
|
||||
self._parse_nm_cswtch_output(result.stdout, base_dir, cswtch_map)
|
||||
|
||||
def _scan_cswtch_in_sdk_archives(
|
||||
self, cswtch_map: dict[str, list[tuple[str, int]]]
|
||||
) -> None:
|
||||
"""Scan SDK library archives (.a) for CSWTCH symbols.
|
||||
|
||||
Prebuilt SDK libraries (e.g. lwip, bearssl) are not compiled from source,
|
||||
so their CSWTCH symbols only exist inside ``.a`` archives. Results are
|
||||
merged into *cswtch_map* for keys not already found in ``.o`` files.
|
||||
|
||||
The same source file (e.g. ``lwip-esp.o``) often appears in multiple
|
||||
library variants (``liblwip2-536.a``, ``liblwip2-1460-feat.a``, etc.),
|
||||
so results are deduplicated by member name.
|
||||
"""
|
||||
sdk_dirs = self._find_sdk_library_dirs()
|
||||
if not sdk_dirs:
|
||||
return
|
||||
|
||||
sdk_archives = sorted(a for sdk_dir in sdk_dirs for a in sdk_dir.glob("*.a"))
|
||||
|
||||
sdk_map: dict[str, list[tuple[str, int]]] = defaultdict(list)
|
||||
self._run_nm_cswtch_scan(sdk_archives, None, sdk_map)
|
||||
|
||||
# Merge SDK results, deduplicating by member name.
|
||||
for key, sources in sdk_map.items():
|
||||
if key in cswtch_map:
|
||||
continue
|
||||
seen: dict[str, tuple[str, int]] = {}
|
||||
for path, sz in sources:
|
||||
member = Path(path).name
|
||||
if member not in seen:
|
||||
seen[member] = (path, sz)
|
||||
cswtch_map[key] = list(seen.values())
|
||||
|
||||
def _source_file_to_component(self, source_file: str) -> str:
|
||||
"""Map a source object file path to its component name.
|
||||
|
||||
Args:
|
||||
source_file: Relative path like 'src/esphome/components/wifi/wifi_component.cpp.o'
|
||||
|
||||
Returns:
|
||||
Component name like '[esphome]wifi' or the source file if unknown.
|
||||
"""
|
||||
parts = Path(source_file).parts
|
||||
|
||||
# ESPHome component: src/esphome/components/<name>/...
|
||||
if "components" in parts:
|
||||
idx = parts.index("components")
|
||||
if idx + 1 < len(parts):
|
||||
component_name = parts[idx + 1]
|
||||
if component_name in get_esphome_components():
|
||||
return f"{_COMPONENT_PREFIX_ESPHOME}{component_name}"
|
||||
if component_name in self.external_components:
|
||||
return f"{_COMPONENT_PREFIX_EXTERNAL}{component_name}"
|
||||
|
||||
# ESPHome core: src/esphome/core/... or src/esphome/...
|
||||
if "core" in parts and "esphome" in parts:
|
||||
return _COMPONENT_CORE
|
||||
if "esphome" in parts and "components" not in parts:
|
||||
return _COMPONENT_CORE
|
||||
|
||||
# Framework/library files - return the first path component
|
||||
# e.g., lib65b/ESPAsyncTCP/... -> lib65b
|
||||
# FrameworkArduino/... -> FrameworkArduino
|
||||
return parts[0] if parts else source_file
|
||||
|
||||
def _analyze_cswtch_symbols(self) -> None:
|
||||
"""Analyze CSWTCH (GCC switch table) symbols by tracing to source objects.
|
||||
|
||||
CSWTCH symbols are compiler-generated lookup tables for switch statements.
|
||||
They are local symbols, so the same name can appear in different object files.
|
||||
This method scans .o files and SDK archives to attribute them to their
|
||||
source components.
|
||||
"""
|
||||
obj_dir = self._find_object_files_dir()
|
||||
if obj_dir is None:
|
||||
_LOGGER.debug("No object files directory found, skipping CSWTCH analysis")
|
||||
return
|
||||
|
||||
# Scan build-dir object files for CSWTCH symbols
|
||||
cswtch_map: dict[str, list[tuple[str, int]]] = defaultdict(list)
|
||||
self._run_nm_cswtch_scan(sorted(obj_dir.rglob("*.o")), obj_dir, cswtch_map)
|
||||
|
||||
# Also scan SDK library archives (.a) for CSWTCH symbols.
|
||||
# Prebuilt SDK libraries (e.g. lwip, bearssl) are not compiled from source
|
||||
# so their symbols only exist inside .a archives, not as loose .o files.
|
||||
self._scan_cswtch_in_sdk_archives(cswtch_map)
|
||||
|
||||
if not cswtch_map:
|
||||
_LOGGER.debug("No CSWTCH symbols found in object files or SDK archives")
|
||||
return
|
||||
|
||||
# Collect CSWTCH symbols from the ELF (already parsed in sections)
|
||||
# Include section_name for re-attribution of component totals
|
||||
elf_cswtch = [
|
||||
(symbol_name, size, section_name)
|
||||
for section_name, section in self.sections.items()
|
||||
for symbol_name, size, _ in section.symbols
|
||||
if symbol_name.startswith("CSWTCH$")
|
||||
]
|
||||
|
||||
_LOGGER.debug(
|
||||
"Found %d CSWTCH symbols in ELF, %d unique in object files",
|
||||
len(elf_cswtch),
|
||||
len(cswtch_map),
|
||||
)
|
||||
|
||||
# Match ELF CSWTCH symbols to source files and re-attribute component totals.
|
||||
# _categorize_symbols() already ran and put these into "other" since CSWTCH$
|
||||
# names don't match any component pattern. We move the bytes to the correct
|
||||
# component based on the object file mapping.
|
||||
other_mem = self.components.get("other")
|
||||
|
||||
for sym_name, size, section_name in elf_cswtch:
|
||||
key = f"{sym_name}:{size}"
|
||||
sources = cswtch_map.get(key, [])
|
||||
|
||||
if len(sources) == 1:
|
||||
source_file = sources[0][0]
|
||||
component = self._source_file_to_component(source_file)
|
||||
elif len(sources) > 1:
|
||||
# Ambiguous - multiple object files have same CSWTCH name+size
|
||||
source_file = "ambiguous"
|
||||
component = "ambiguous"
|
||||
_LOGGER.debug(
|
||||
"Ambiguous CSWTCH %s (%d B) found in %d files: %s",
|
||||
sym_name,
|
||||
size,
|
||||
len(sources),
|
||||
", ".join(src for src, _ in sources),
|
||||
)
|
||||
else:
|
||||
source_file = "unknown"
|
||||
component = "unknown"
|
||||
|
||||
self._cswtch_symbols.append((sym_name, size, source_file, component))
|
||||
|
||||
# Re-attribute from "other" to the correct component
|
||||
if (
|
||||
component not in ("other", "unknown", "ambiguous")
|
||||
and other_mem is not None
|
||||
):
|
||||
other_mem.add_section_size(section_name, -size)
|
||||
if component not in self.components:
|
||||
self.components[component] = ComponentMemory(component)
|
||||
self.components[component].add_section_size(section_name, size)
|
||||
|
||||
# Sort by size descending
|
||||
self._cswtch_symbols.sort(key=lambda x: x[1], reverse=True)
|
||||
|
||||
total_size = sum(size for _, size, _, _ in self._cswtch_symbols)
|
||||
_LOGGER.debug(
|
||||
"CSWTCH analysis: %d symbols, %d bytes total",
|
||||
len(self._cswtch_symbols),
|
||||
total_size,
|
||||
)
|
||||
|
||||
def get_unattributed_ram(self) -> tuple[int, int, int]:
|
||||
"""Get unattributed RAM sizes (SDK/framework overhead).
|
||||
|
||||
|
||||
@@ -4,9 +4,6 @@ from __future__ import annotations
|
||||
|
||||
from collections import defaultdict
|
||||
from collections.abc import Callable
|
||||
import heapq
|
||||
import json
|
||||
from operator import itemgetter
|
||||
import sys
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
@@ -32,10 +29,6 @@ class MemoryAnalyzerCLI(MemoryAnalyzer):
|
||||
)
|
||||
# Lower threshold for RAM symbols (RAM is more constrained)
|
||||
RAM_SYMBOL_SIZE_THRESHOLD: int = 24
|
||||
# Number of top symbols to show in the largest symbols report
|
||||
TOP_SYMBOLS_LIMIT: int = 30
|
||||
# Width for symbol name display in top symbols report
|
||||
COL_TOP_SYMBOL_NAME: int = 55
|
||||
|
||||
# Column width constants
|
||||
COL_COMPONENT: int = 29
|
||||
@@ -154,83 +147,6 @@ class MemoryAnalyzerCLI(MemoryAnalyzer):
|
||||
section_label = f" [{section[1:]}]" # .data -> [data], .bss -> [bss]
|
||||
return f"{demangled} ({size:,} B){section_label}"
|
||||
|
||||
def _add_top_symbols(self, lines: list[str]) -> None:
|
||||
"""Add a section showing the top largest symbols in the binary."""
|
||||
# Collect all symbols from all components: (symbol, demangled, size, section, component)
|
||||
all_symbols = [
|
||||
(symbol, demangled, size, section, component)
|
||||
for component, symbols in self._component_symbols.items()
|
||||
for symbol, demangled, size, section in symbols
|
||||
]
|
||||
|
||||
# Get top N symbols by size using heapq for efficiency
|
||||
top_symbols = heapq.nlargest(
|
||||
self.TOP_SYMBOLS_LIMIT, all_symbols, key=itemgetter(2)
|
||||
)
|
||||
|
||||
lines.append("")
|
||||
lines.append(f"Top {self.TOP_SYMBOLS_LIMIT} Largest Symbols:")
|
||||
# Calculate truncation limit from column width (leaving room for "...")
|
||||
truncate_limit = self.COL_TOP_SYMBOL_NAME - 3
|
||||
for i, (_, demangled, size, section, component) in enumerate(top_symbols):
|
||||
# Format section label
|
||||
section_label = f"[{section[1:]}]" if section else ""
|
||||
# Truncate demangled name if too long
|
||||
demangled_display = (
|
||||
f"{demangled[:truncate_limit]}..."
|
||||
if len(demangled) > self.COL_TOP_SYMBOL_NAME
|
||||
else demangled
|
||||
)
|
||||
lines.append(
|
||||
f"{i + 1:>2}. {size:>7,} B {section_label:<8} {demangled_display:<{self.COL_TOP_SYMBOL_NAME}} {component}"
|
||||
)
|
||||
|
||||
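For reference, `heapq.nlargest` picks the top N entries without sorting the whole symbol list; a minimal sketch using made-up tuples in the `(symbol, demangled, size, section, component)` layout used above:

```python
import heapq
from operator import itemgetter

# Hypothetical symbol tuples: (symbol, demangled, size, section, component)
all_symbols = [
    ("_ZN7esphome3fooEv", "esphome::foo()", 412, ".text", "[core]"),
    ("wifi_task_stack", "wifi_task_stack", 2048, ".bss", "wifi"),
    ("CSWTCH$97", "CSWTCH$97", 96, ".rodata", "sensor"),
]
top = heapq.nlargest(2, all_symbols, key=itemgetter(2))
# -> 2048 B wifi_task_stack, then 412 B esphome::foo(); only the largest
#    TOP_SYMBOLS_LIMIT entries are kept, without a full sort of the list.
```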
def _add_cswtch_analysis(self, lines: list[str]) -> None:
|
||||
"""Add CSWTCH (GCC switch table lookup) analysis section."""
|
||||
self._add_section_header(lines, "CSWTCH Analysis (GCC Switch Table Lookups)")
|
||||
|
||||
total_size = sum(size for _, size, _, _ in self._cswtch_symbols)
|
||||
lines.append(
|
||||
f"Total: {len(self._cswtch_symbols)} switch table(s), {total_size:,} B"
|
||||
)
|
||||
lines.append("")
|
||||
|
||||
# Group by component
|
||||
by_component: dict[str, list[tuple[str, int, str]]] = defaultdict(list)
|
||||
for sym_name, size, source_file, component in self._cswtch_symbols:
|
||||
by_component[component].append((sym_name, size, source_file))
|
||||
|
||||
# Sort components by total size descending
|
||||
sorted_components = sorted(
|
||||
by_component.items(),
|
||||
key=lambda x: sum(s[1] for s in x[1]),
|
||||
reverse=True,
|
||||
)
|
||||
|
||||
for component, symbols in sorted_components:
|
||||
comp_total = sum(s[1] for s in symbols)
|
||||
lines.append(f"{component} ({comp_total:,} B, {len(symbols)} tables):")
|
||||
|
||||
# Group by source file within component
|
||||
by_file: dict[str, list[tuple[str, int]]] = defaultdict(list)
|
||||
for sym_name, size, source_file in symbols:
|
||||
by_file[source_file].append((sym_name, size))
|
||||
|
||||
for source_file, file_symbols in sorted(
|
||||
by_file.items(),
|
||||
key=lambda x: sum(s[1] for s in x[1]),
|
||||
reverse=True,
|
||||
):
|
||||
file_total = sum(s[1] for s in file_symbols)
|
||||
lines.append(
|
||||
f" {source_file} ({file_total:,} B, {len(file_symbols)} tables)"
|
||||
)
|
||||
for sym_name, size in sorted(
|
||||
file_symbols, key=lambda x: x[1], reverse=True
|
||||
):
|
||||
lines.append(f" {size:>6,} B {sym_name}")
|
||||
lines.append("")
|
||||
|
||||
def generate_report(self, detailed: bool = False) -> str:
|
||||
"""Generate a formatted memory report."""
|
||||
components = sorted(
|
||||
@@ -332,9 +248,6 @@ class MemoryAnalyzerCLI(MemoryAnalyzer):
|
||||
"RAM",
|
||||
)
|
||||
|
||||
# Top largest symbols in the binary
|
||||
self._add_top_symbols(lines)
|
||||
|
||||
# Add ESPHome core detailed analysis if there are core symbols
|
||||
if self._esphome_core_symbols:
|
||||
self._add_section_header(lines, f"{_COMPONENT_CORE} Detailed Analysis")
|
||||
@@ -518,10 +431,6 @@ class MemoryAnalyzerCLI(MemoryAnalyzer):
|
||||
lines.append(f" ... and {len(large_ram_syms) - 10} more")
|
||||
lines.append("")
|
||||
|
||||
# CSWTCH (GCC switch table) analysis
|
||||
if self._cswtch_symbols:
|
||||
self._add_cswtch_analysis(lines)
|
||||
|
||||
lines.append(
|
||||
"Note: This analysis covers symbols in the ELF file. Some runtime allocations may not be included."
|
||||
)
|
||||
@@ -529,28 +438,6 @@ class MemoryAnalyzerCLI(MemoryAnalyzer):
|
||||
|
||||
return "\n".join(lines)
|
||||
|
||||
def to_json(self) -> str:
|
||||
"""Export analysis results as JSON."""
|
||||
data = {
|
||||
"components": {
|
||||
name: {
|
||||
"text": mem.text_size,
|
||||
"rodata": mem.rodata_size,
|
||||
"data": mem.data_size,
|
||||
"bss": mem.bss_size,
|
||||
"flash_total": mem.flash_total,
|
||||
"ram_total": mem.ram_total,
|
||||
"symbol_count": mem.symbol_count,
|
||||
}
|
||||
for name, mem in self.components.items()
|
||||
},
|
||||
"totals": {
|
||||
"flash": sum(c.flash_total for c in self.components.values()),
|
||||
"ram": sum(c.ram_total for c in self.components.values()),
|
||||
},
|
||||
}
|
||||
return json.dumps(data, indent=2)
|
||||
|
||||
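A sketch of the dict shape serialized by `to_json()`; the component name and byte counts below are invented, and the flash/RAM totals assume the usual flash = text + rodata + data and ram = data + bss split:

```python
# Invented example of the structure produced above
example = {
    "components": {
        "wifi": {
            "text": 51200, "rodata": 8192, "data": 1024, "bss": 4096,
            "flash_total": 60416,  # assumed: text + rodata + data
            "ram_total": 5120,     # assumed: data + bss
            "symbol_count": 312,
        },
    },
    "totals": {"flash": 60416, "ram": 5120},
}
```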
def dump_uncategorized_symbols(self, output_file: str | None = None) -> None:
|
||||
"""Dump uncategorized symbols for analysis."""
|
||||
# Sort by size descending
|
||||
|
||||
@@ -9,61 +9,20 @@ ESPHOME_COMPONENT_PATTERN = re.compile(r"esphome::([a-zA-Z0-9_]+)::")
|
||||
# Maps standard section names to their various platform-specific variants
|
||||
# Note: Order matters! More specific patterns (.bss) must come before general ones (.dram)
|
||||
# because ESP-IDF uses names like ".dram0.bss" which would match ".dram" otherwise
|
||||
#
|
||||
# Platform-specific sections:
|
||||
# - ESP8266/ESP32: .iram*, .dram*
|
||||
# - LibreTiny RTL87xx: .xip.code_* (flash), .ram.code_* (RAM)
|
||||
# - LibreTiny BK7231: .itcm.code (fast RAM), .vectors (interrupt vectors)
|
||||
# - LibreTiny LN882X: .flash_text, .flash_copy* (flash code)
|
||||
# - Zephyr/nRF52: text, rodata, datas, bss (no leading dots)
|
||||
SECTION_MAPPING = {
|
||||
".text": frozenset(
|
||||
[
|
||||
".text",
|
||||
".iram",
|
||||
# LibreTiny RTL87xx XIP (eXecute In Place) flash code
|
||||
".xip.code",
|
||||
# LibreTiny RTL87xx RAM code
|
||||
".ram.code_text",
|
||||
# LibreTiny BK7231 fast RAM code and vectors
|
||||
".itcm.code",
|
||||
".vectors",
|
||||
# LibreTiny LN882X flash code
|
||||
".flash_text",
|
||||
".flash_copy",
|
||||
# Zephyr/nRF52 sections (no leading dots)
|
||||
"text",
|
||||
"rom_start",
|
||||
]
|
||||
),
|
||||
".rodata": frozenset(
|
||||
[
|
||||
".rodata",
|
||||
# LibreTiny RTL87xx read-only data in RAM
|
||||
".ram.code_rodata",
|
||||
# Zephyr/nRF52 sections (no leading dots)
|
||||
"rodata",
|
||||
]
|
||||
),
|
||||
# .bss patterns - must be before .data to catch ".dram0.bss"
|
||||
".bss": frozenset(
|
||||
[
|
||||
".bss",
|
||||
# LibreTiny LN882X BSS
|
||||
".bss_ram",
|
||||
# Zephyr/nRF52 sections (no leading dots)
|
||||
"bss",
|
||||
"noinit",
|
||||
]
|
||||
),
|
||||
".data": frozenset(
|
||||
[
|
||||
".data",
|
||||
".dram",
|
||||
# Zephyr/nRF52 sections (no leading dots)
|
||||
"datas",
|
||||
]
|
||||
),
|
||||
".text": frozenset([".text", ".iram"]),
|
||||
".rodata": frozenset([".rodata"]),
|
||||
".bss": frozenset([".bss"]), # Must be before .data to catch ".dram0.bss"
|
||||
".data": frozenset([".data", ".dram"]),
|
||||
}
|
||||
|
||||
# Section to ComponentMemory attribute mapping
|
||||
# Maps section names to the attribute name in ComponentMemory dataclass
|
||||
SECTION_TO_ATTR = {
|
||||
".text": "text_size",
|
||||
".rodata": "rodata_size",
|
||||
".data": "data_size",
|
||||
".bss": "bss_size",
|
||||
}
|
||||
|
||||
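One plausible way the ordering requirement plays out, assuming `map_section_name` (defined elsewhere) walks `SECTION_MAPPING` in declaration order and matches variants by substring; this is an illustrative sketch, not the real implementation:

```python
def map_section_name_sketch(raw_section: str) -> str | None:
    # Dict order is insertion order, so ".bss" variants are tried before ".data"/".dram".
    for canonical, variants in SECTION_MAPPING.items():
        if any(variant in raw_section for variant in variants):
            return canonical
    return None

# ".dram0.bss" -> ".bss" (correct); if the ".data"/".dram" entry were listed
# first, the same input would be misattributed as initialized data.
```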
# Component identification rules
|
||||
@@ -504,9 +463,7 @@ SYMBOL_PATTERNS = {
|
||||
"__FUNCTION__$",
|
||||
"DAYS_IN_MONTH",
|
||||
"_DAYS_BEFORE_MONTH",
|
||||
# Note: CSWTCH$ symbols are GCC switch table lookup tables.
|
||||
# They are attributed to their source object files via _analyze_cswtch_symbols()
|
||||
# rather than being lumped into libc.
|
||||
"CSWTCH$",
|
||||
"dst$",
|
||||
"sulp",
|
||||
"_strtol_l", # String to long with locale
|
||||
|
||||
@@ -94,13 +94,13 @@ def parse_symbol_line(line: str) -> tuple[str, str, int, str] | None:
|
||||
return None
|
||||
|
||||
# Find section, size, and name
|
||||
# Try each part as a potential section name
|
||||
for i, part in enumerate(parts):
|
||||
# Skip parts that are clearly flags, addresses, or other metadata
|
||||
# Sections start with '.' (standard ELF) or are known section names (Zephyr)
|
||||
if not part.startswith("."):
|
||||
continue
|
||||
|
||||
section = map_section_name(part)
|
||||
if not section:
|
||||
continue
|
||||
break
|
||||
|
||||
# Need at least size field after section
|
||||
if i + 1 >= len(parts):
|
||||
|
||||
@@ -3,7 +3,6 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import logging
|
||||
import os
|
||||
from pathlib import Path
|
||||
import subprocess
|
||||
from typing import TYPE_CHECKING
|
||||
@@ -18,82 +17,10 @@ TOOLCHAIN_PREFIXES = [
|
||||
"xtensa-lx106-elf-", # ESP8266
|
||||
"xtensa-esp32-elf-", # ESP32
|
||||
"xtensa-esp-elf-", # ESP32 (newer IDF)
|
||||
"arm-zephyr-eabi-", # nRF52/Zephyr SDK
|
||||
"arm-none-eabi-", # Generic ARM (RP2040, etc.)
|
||||
"", # System default (no prefix)
|
||||
]
|
||||
|
||||
|
||||
def _find_in_platformio_packages(tool_name: str) -> str | None:
|
||||
"""Search for a tool in PlatformIO package directories.
|
||||
|
||||
This handles cases like Zephyr SDK where tools are installed in nested
|
||||
directories that aren't in PATH.
|
||||
|
||||
Args:
|
||||
tool_name: Name of the tool (e.g., "readelf", "objdump")
|
||||
|
||||
Returns:
|
||||
Full path to the tool or None if not found
|
||||
"""
|
||||
# Get PlatformIO packages directory
|
||||
platformio_home = Path(os.path.expanduser("~/.platformio/packages"))
|
||||
if not platformio_home.exists():
|
||||
return None
|
||||
|
||||
# Search patterns for toolchains that might contain the tool
# Order matters - more specific patterns first
search_patterns = [
    # Zephyr SDK deeply nested structure (4 levels)
    # e.g., toolchain-gccarmnoneeabi/zephyr-sdk-0.17.4/arm-zephyr-eabi/bin/arm-zephyr-eabi-objdump
    f"toolchain-*/*/*/bin/*-{tool_name}",
    # Zephyr SDK nested structure (3 levels)
    f"toolchain-*/*/bin/*-{tool_name}",
    f"toolchain-*/bin/*-{tool_name}",
    # Standard PlatformIO toolchain structure
    f"toolchain-*/bin/*{tool_name}",
]

for pattern in search_patterns:
    matches = list(platformio_home.glob(pattern))
    if matches:
        # Sort to get consistent results, prefer arm-zephyr-eabi over arm-none-eabi
        matches.sort(key=lambda p: ("zephyr" not in str(p), str(p)))
        tool_path = str(matches[0])
        _LOGGER.debug("Found %s in PlatformIO packages: %s", tool_name, tool_path)
        return tool_path

return None

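Hypothetical usage of the helper above; the printed path is only an example of the nested Zephyr SDK layout described in the search patterns, not a guaranteed install location:

```python
objdump = _find_in_platformio_packages("objdump")
if objdump is not None:
    print(objdump)
    # e.g. ~/.platformio/packages/toolchain-gccarmnoneeabi/zephyr-sdk-0.17.4/
    #      arm-zephyr-eabi/bin/arm-zephyr-eabi-objdump
```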
def resolve_tool_path(
|
||||
tool_name: str,
|
||||
derived_path: str | None,
|
||||
objdump_path: str | None = None,
|
||||
) -> str | None:
|
||||
"""Resolve a tool path, falling back to find_tool if derived path doesn't exist.
|
||||
|
||||
Args:
|
||||
tool_name: Name of the tool (e.g., "objdump", "readelf")
|
||||
derived_path: Path derived from idedata (may not exist for some platforms)
|
||||
objdump_path: Path to objdump binary to derive other tool paths from
|
||||
|
||||
Returns:
|
||||
Resolved path to the tool, or the original derived_path if it exists
|
||||
"""
|
||||
if derived_path and not Path(derived_path).exists():
|
||||
found = find_tool(tool_name, objdump_path)
|
||||
if found:
|
||||
_LOGGER.debug(
|
||||
"Derived %s path %s not found, using %s",
|
||||
tool_name,
|
||||
derived_path,
|
||||
found,
|
||||
)
|
||||
return found
|
||||
return derived_path
|
||||
|
||||
|
||||
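A hypothetical call showing the fallback behaviour: if the idedata-derived path no longer exists (the path below is made up), `resolve_tool_path` hands off to `find_tool`:

```python
readelf = resolve_tool_path(
    "readelf",
    derived_path="/opt/old-toolchain/bin/xtensa-esp32-elf-readelf",  # stale, made-up path
    objdump_path=None,
)
# Returns the derived path if it exists, otherwise whatever find_tool() locates.
```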
def find_tool(
|
||||
tool_name: str,
|
||||
objdump_path: str | None = None,
|
||||
@@ -101,8 +28,7 @@ def find_tool(
|
||||
"""Find a toolchain tool by name.
|
||||
|
||||
First tries to derive the tool path from objdump_path (if provided),
|
||||
then searches PlatformIO package directories (for cross-compile toolchains),
|
||||
and finally falls back to searching for platform-specific tools in PATH.
|
||||
then falls back to searching for platform-specific tools.
|
||||
|
||||
Args:
|
||||
tool_name: Name of the tool (e.g., "objdump", "nm", "c++filt")
|
||||
@@ -121,13 +47,7 @@ def find_tool(
|
||||
_LOGGER.debug("Found %s at: %s", tool_name, potential_path)
|
||||
return potential_path
|
||||
|
||||
# Search in PlatformIO packages directory first (handles Zephyr SDK, etc.)
|
||||
# This must come before PATH search because system tools (e.g., /usr/bin/objdump)
|
||||
# are for the host architecture, not the target (ARM, Xtensa, etc.)
|
||||
if found := _find_in_platformio_packages(tool_name):
|
||||
return found
|
||||
|
||||
# Try platform-specific tools in PATH (fallback for when tools are installed globally)
|
||||
# Try platform-specific tools
|
||||
for prefix in TOOLCHAIN_PREFIXES:
|
||||
cmd = f"{prefix}{tool_name}"
|
||||
try:
|
||||
|
||||
@@ -1,139 +0,0 @@
|
||||
"""ESP-IDF direct build generator for ESPHome."""
|
||||
|
||||
import json
|
||||
from pathlib import Path
|
||||
|
||||
from esphome.components.esp32 import get_esp32_variant
|
||||
from esphome.core import CORE
|
||||
from esphome.helpers import mkdir_p, write_file_if_changed
|
||||
|
||||
|
||||
def get_available_components() -> list[str] | None:
|
||||
"""Get list of available ESP-IDF components from project_description.json.
|
||||
|
||||
Returns only internal ESP-IDF components, excluding external/managed
|
||||
components (from idf_component.yml).
|
||||
"""
|
||||
project_desc = Path(CORE.build_path) / "build" / "project_description.json"
|
||||
if not project_desc.exists():
|
||||
return None
|
||||
|
||||
try:
|
||||
with open(project_desc, encoding="utf-8") as f:
|
||||
data = json.load(f)
|
||||
|
||||
component_info = data.get("build_component_info", {})
|
||||
|
||||
result = []
|
||||
for name, info in component_info.items():
|
||||
# Exclude our own src component
|
||||
if name == "src":
|
||||
continue
|
||||
|
||||
# Exclude managed/external components
|
||||
comp_dir = info.get("dir", "")
|
||||
if "managed_components" in comp_dir:
|
||||
continue
|
||||
|
||||
result.append(name)
|
||||
|
||||
return result
|
||||
except (json.JSONDecodeError, OSError):
|
||||
return None
|
||||
|
||||
|
||||
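An assumed fragment of `project_description.json` matching the filtering above (the field names follow the code; directories and component names are invented):

```python
example_project_description = {
    "build_component_info": {
        "src": {"dir": "/build/kitchen/src"},               # skipped: ESPHome's own component
        "esp_adc": {"dir": "/esp-idf/components/esp_adc"},  # kept: internal ESP-IDF component
        "mdns": {"dir": "/build/kitchen/managed_components/espressif__mdns"},  # skipped: managed/external
    }
}
# get_available_components() would return ["esp_adc"] for this input.
```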
def has_discovered_components() -> bool:
|
||||
"""Check if we have discovered components from a previous configure."""
|
||||
return get_available_components() is not None
|
||||
|
||||
|
||||
def get_project_cmakelists() -> str:
|
||||
"""Generate the top-level CMakeLists.txt for ESP-IDF project."""
|
||||
# Get IDF target from ESP32 variant (e.g., ESP32S3 -> esp32s3)
|
||||
variant = get_esp32_variant()
|
||||
idf_target = variant.lower().replace("-", "")
|
||||
|
||||
return f"""\
|
||||
# Auto-generated by ESPHome
|
||||
cmake_minimum_required(VERSION 3.16)
|
||||
|
||||
set(IDF_TARGET {idf_target})
|
||||
set(EXTRA_COMPONENT_DIRS ${{CMAKE_SOURCE_DIR}}/src)
|
||||
|
||||
include($ENV{{IDF_PATH}}/tools/cmake/project.cmake)
|
||||
project({CORE.name})
|
||||
"""
|
||||
|
||||
|
||||
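For reference, the rendered output for a hypothetical node named `kitchen` built for the ESP32-S3 variant would be:

```python
# Expected rendering of get_project_cmakelists() for a hypothetical node
# "kitchen" on an ESP32-S3 (only the target and project name vary):
expected = """\
# Auto-generated by ESPHome
cmake_minimum_required(VERSION 3.16)

set(IDF_TARGET esp32s3)
set(EXTRA_COMPONENT_DIRS ${CMAKE_SOURCE_DIR}/src)

include($ENV{IDF_PATH}/tools/cmake/project.cmake)
project(kitchen)
"""
```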
def get_component_cmakelists(minimal: bool = False) -> str:
|
||||
"""Generate the main component CMakeLists.txt."""
|
||||
idf_requires = [] if minimal else (get_available_components() or [])
|
||||
requires_str = " ".join(idf_requires)
|
||||
|
||||
# Extract compile definitions from build flags (-DXXX -> XXX)
|
||||
compile_defs = [flag[2:] for flag in CORE.build_flags if flag.startswith("-D")]
|
||||
compile_defs_str = "\n ".join(compile_defs) if compile_defs else ""
|
||||
|
||||
# Extract compile options (-W flags, excluding linker flags)
|
||||
compile_opts = [
|
||||
flag
|
||||
for flag in CORE.build_flags
|
||||
if flag.startswith("-W") and not flag.startswith("-Wl,")
|
||||
]
|
||||
compile_opts_str = "\n ".join(compile_opts) if compile_opts else ""
|
||||
|
||||
# Extract linker options (-Wl, flags)
|
||||
link_opts = [flag for flag in CORE.build_flags if flag.startswith("-Wl,")]
|
||||
link_opts_str = "\n ".join(link_opts) if link_opts else ""
|
||||
|
||||
return f"""\
|
||||
# Auto-generated by ESPHome
|
||||
file(GLOB_RECURSE app_sources
|
||||
"${{CMAKE_CURRENT_SOURCE_DIR}}/*.cpp"
|
||||
"${{CMAKE_CURRENT_SOURCE_DIR}}/*.c"
|
||||
"${{CMAKE_CURRENT_SOURCE_DIR}}/esphome/*.cpp"
|
||||
"${{CMAKE_CURRENT_SOURCE_DIR}}/esphome/*.c"
|
||||
)
|
||||
|
||||
idf_component_register(
|
||||
SRCS ${{app_sources}}
|
||||
INCLUDE_DIRS "." "esphome"
|
||||
REQUIRES {requires_str}
|
||||
)
|
||||
|
||||
# Apply C++ standard
|
||||
target_compile_features(${{COMPONENT_LIB}} PUBLIC cxx_std_20)
|
||||
|
||||
# ESPHome compile definitions
|
||||
target_compile_definitions(${{COMPONENT_LIB}} PUBLIC
|
||||
{compile_defs_str}
|
||||
)
|
||||
|
||||
# ESPHome compile options
|
||||
target_compile_options(${{COMPONENT_LIB}} PUBLIC
|
||||
{compile_opts_str}
|
||||
)
|
||||
|
||||
# ESPHome linker options
|
||||
target_link_options(${{COMPONENT_LIB}} PUBLIC
|
||||
{link_opts_str}
|
||||
)
|
||||
"""
|
||||
|
||||
|
||||
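A small sketch of how the build-flag partitioning above behaves on a hypothetical flag list (the flags are examples, not actual ESPHome defaults):

```python
flags = ["-DUSE_ESP32", "-Wno-unused-variable", "-Wl,--gc-sections", "-Os"]

compile_defs = [f[2:] for f in flags if f.startswith("-D")]  # ["USE_ESP32"]
compile_opts = [
    f for f in flags if f.startswith("-W") and not f.startswith("-Wl,")
]  # ["-Wno-unused-variable"]
link_opts = [f for f in flags if f.startswith("-Wl,")]  # ["-Wl,--gc-sections"]
# "-Os" matches none of the filters, so it is not forwarded by this path.
```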
def write_project(minimal: bool = False) -> None:
|
||||
"""Write ESP-IDF project files."""
|
||||
mkdir_p(CORE.build_path)
|
||||
mkdir_p(CORE.relative_src_path())
|
||||
|
||||
# Write top-level CMakeLists.txt
|
||||
write_file_if_changed(
|
||||
CORE.relative_build_path("CMakeLists.txt"),
|
||||
get_project_cmakelists(),
|
||||
)
|
||||
|
||||
# Write component CMakeLists.txt in src/
|
||||
write_file_if_changed(
|
||||
CORE.relative_src_path("CMakeLists.txt"),
|
||||
get_component_cmakelists(minimal=minimal),
|
||||
)
|
||||
@@ -69,7 +69,6 @@ from esphome.cpp_types import ( # noqa: F401
|
||||
JsonObjectConst,
|
||||
Parented,
|
||||
PollingComponent,
|
||||
StringRef,
|
||||
arduino_json_ns,
|
||||
bool_,
|
||||
const_char_ptr,
|
||||
@@ -87,7 +86,6 @@ from esphome.cpp_types import ( # noqa: F401
|
||||
size_t,
|
||||
std_ns,
|
||||
std_shared_ptr,
|
||||
std_span,
|
||||
std_string,
|
||||
std_string_ref,
|
||||
std_vector,
|
||||
|
||||
@@ -45,6 +45,8 @@ void AbsoluteHumidityComponent::dump_config() {
|
||||
this->temperature_sensor_->get_name().c_str(), this->humidity_sensor_->get_name().c_str());
|
||||
}
|
||||
|
||||
float AbsoluteHumidityComponent::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
void AbsoluteHumidityComponent::loop() {
|
||||
if (!this->next_update_) {
|
||||
return;
|
||||
|
||||
@@ -24,6 +24,7 @@ class AbsoluteHumidityComponent : public sensor::Sensor, public Component {
|
||||
|
||||
void setup() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
void loop() override;
|
||||
|
||||
protected:
|
||||
|
||||
@@ -68,6 +68,11 @@ class ADCSensor : public sensor::Sensor, public PollingComponent, public voltage
|
||||
/// This method is called during the ESPHome setup process to log the configuration.
|
||||
void dump_config() override;
|
||||
|
||||
/// Return the setup priority for this component.
|
||||
/// Components with higher priority are initialized earlier during setup.
|
||||
/// @return A float representing the setup priority.
|
||||
float get_setup_priority() const override;
|
||||
|
||||
#ifdef USE_ZEPHYR
|
||||
/// Set the ADC channel to be used by the ADC sensor.
|
||||
/// @param channel Pointer to an adc_dt_spec structure representing the ADC channel.
|
||||
|
||||
@@ -79,5 +79,7 @@ void ADCSensor::set_sample_count(uint8_t sample_count) {
|
||||
|
||||
void ADCSensor::set_sampling_mode(SamplingMode sampling_mode) { this->sampling_mode_ = sampling_mode; }
|
||||
|
||||
float ADCSensor::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
} // namespace adc
|
||||
} // namespace esphome
|
||||
|
||||
@@ -42,11 +42,11 @@ void ADCSensor::setup() {
|
||||
adc_oneshot_unit_init_cfg_t init_config = {}; // Zero initialize
|
||||
init_config.unit_id = this->adc_unit_;
|
||||
init_config.ulp_mode = ADC_ULP_MODE_DISABLE;
|
||||
#if USE_ESP32_VARIANT_ESP32C2 || USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || \
|
||||
USE_ESP32_VARIANT_ESP32C6 || USE_ESP32_VARIANT_ESP32C61 || USE_ESP32_VARIANT_ESP32H2
|
||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
||||
USE_ESP32_VARIANT_ESP32C61 || USE_ESP32_VARIANT_ESP32H2
|
||||
init_config.clk_src = ADC_DIGI_CLK_SRC_DEFAULT;
|
||||
#endif // USE_ESP32_VARIANT_ESP32C2 || USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 ||
|
||||
// USE_ESP32_VARIANT_ESP32C6 || USE_ESP32_VARIANT_ESP32C61 || USE_ESP32_VARIANT_ESP32H2
|
||||
#endif // USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 ||
|
||||
// USE_ESP32_VARIANT_ESP32C61 || USE_ESP32_VARIANT_ESP32H2
|
||||
esp_err_t err = adc_oneshot_new_unit(&init_config, &ADCSensor::shared_adc_handles[this->adc_unit_]);
|
||||
if (err != ESP_OK) {
|
||||
ESP_LOGE(TAG, "Error initializing %s: %d", LOG_STR_ARG(adc_unit_to_str(this->adc_unit_)), err);
|
||||
@@ -76,7 +76,7 @@ void ADCSensor::setup() {
|
||||
|
||||
#if USE_ESP32_VARIANT_ESP32C3 || USE_ESP32_VARIANT_ESP32C5 || USE_ESP32_VARIANT_ESP32C6 || \
|
||||
USE_ESP32_VARIANT_ESP32C61 || USE_ESP32_VARIANT_ESP32H2 || USE_ESP32_VARIANT_ESP32P4 || USE_ESP32_VARIANT_ESP32S3
|
||||
// RISC-V variants (except C2) and S3 use curve fitting calibration
|
||||
// RISC-V variants and S3 use curve fitting calibration
|
||||
adc_cali_curve_fitting_config_t cali_config = {}; // Zero initialize first
|
||||
#if ESP_IDF_VERSION >= ESP_IDF_VERSION_VAL(5, 3, 0)
|
||||
cali_config.chan = this->channel_;
|
||||
@@ -94,14 +94,14 @@ void ADCSensor::setup() {
|
||||
ESP_LOGW(TAG, "Curve fitting calibration failed with error %d, will use uncalibrated readings", err);
|
||||
this->setup_flags_.calibration_complete = false;
|
||||
}
|
||||
#else // ESP32, ESP32-S2, and ESP32-C2 use line fitting calibration
|
||||
#else // Other ESP32 variants use line fitting calibration
|
||||
adc_cali_line_fitting_config_t cali_config = {
|
||||
.unit_id = this->adc_unit_,
|
||||
.atten = this->attenuation_,
|
||||
.bitwidth = ADC_BITWIDTH_DEFAULT,
|
||||
#if !defined(USE_ESP32_VARIANT_ESP32S2) && !defined(USE_ESP32_VARIANT_ESP32C2)
|
||||
#if !defined(USE_ESP32_VARIANT_ESP32S2)
|
||||
.default_vref = 1100, // Default reference voltage in mV
|
||||
#endif // !defined(USE_ESP32_VARIANT_ESP32S2) && !defined(USE_ESP32_VARIANT_ESP32C2)
|
||||
#endif // !defined(USE_ESP32_VARIANT_ESP32S2)
|
||||
};
|
||||
err = adc_cali_create_scheme_line_fitting(&cali_config, &handle);
|
||||
if (err == ESP_OK) {
|
||||
@@ -112,7 +112,7 @@ void ADCSensor::setup() {
|
||||
ESP_LOGW(TAG, "Line fitting calibration failed with error %d, will use uncalibrated readings", err);
|
||||
this->setup_flags_.calibration_complete = false;
|
||||
}
|
||||
#endif // ESP32C3 || ESP32C5 || ESP32C6 || ESP32C61 || ESP32H2 || ESP32P4 || ESP32S3
|
||||
#endif // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32C61 || ESP32H2 || ESP32P4 || ESP32S3
|
||||
}
|
||||
|
||||
this->setup_flags_.init_complete = true;
|
||||
@@ -189,7 +189,7 @@ float ADCSensor::sample_fixed_attenuation_() {
|
||||
adc_cali_delete_scheme_curve_fitting(this->calibration_handle_);
|
||||
#else // Other ESP32 variants use line fitting calibration
|
||||
adc_cali_delete_scheme_line_fitting(this->calibration_handle_);
|
||||
#endif // ESP32C3 || ESP32C5 || ESP32C6 || ESP32C61 || ESP32H2 || ESP32P4 || ESP32S3
|
||||
#endif // USE_ESP32_VARIANT_ESP32C3 || ESP32C5 || ESP32C6 || ESP32C61 || ESP32H2 || ESP32P4 || ESP32S3
|
||||
this->calibration_handle_ = nullptr;
|
||||
}
|
||||
}
|
||||
@@ -247,7 +247,7 @@ float ADCSensor::sample_autorange_() {
|
||||
.unit_id = this->adc_unit_,
|
||||
.atten = atten,
|
||||
.bitwidth = ADC_BITWIDTH_DEFAULT,
|
||||
#if !defined(USE_ESP32_VARIANT_ESP32S2) && !defined(USE_ESP32_VARIANT_ESP32C2)
|
||||
#if !defined(USE_ESP32_VARIANT_ESP32S2)
|
||||
.default_vref = 1100,
|
||||
#endif
|
||||
};
|
||||
|
||||
@@ -2,7 +2,7 @@ import logging
|
||||
|
||||
import esphome.codegen as cg
|
||||
from esphome.components import sensor, voltage_sampler
|
||||
from esphome.components.esp32 import get_esp32_variant, include_builtin_idf_component
|
||||
from esphome.components.esp32 import get_esp32_variant
|
||||
from esphome.components.nrf52.const import AIN_TO_GPIO, EXTRA_ADC
|
||||
from esphome.components.zephyr import (
|
||||
zephyr_add_overlay,
|
||||
@@ -118,9 +118,6 @@ async def to_code(config):
|
||||
cg.add(var.set_sampling_mode(config[CONF_SAMPLING_MODE]))
|
||||
|
||||
if CORE.is_esp32:
|
||||
# Re-enable ESP-IDF's ADC driver (excluded by default to save compile time)
|
||||
include_builtin_idf_component("esp_adc")
|
||||
|
||||
if attenuation := config.get(CONF_ATTENUATION):
|
||||
if attenuation == "auto":
|
||||
cg.add(var.set_autorange(cg.global_ns.true))
|
||||
@@ -163,21 +160,21 @@ async def to_code(config):
|
||||
zephyr_add_user("io-channels", f"<&adc {channel_id}>")
|
||||
zephyr_add_overlay(
|
||||
f"""
|
||||
&adc {{
|
||||
#address-cells = <1>;
|
||||
#size-cells = <0>;
|
||||
&adc {{
|
||||
#address-cells = <1>;
|
||||
#size-cells = <0>;
|
||||
|
||||
channel@{channel_id} {{
|
||||
reg = <{channel_id}>;
|
||||
zephyr,gain = "{gain}";
|
||||
zephyr,reference = "ADC_REF_INTERNAL";
|
||||
zephyr,acquisition-time = <ADC_ACQ_TIME_DEFAULT>;
|
||||
zephyr,input-positive = <NRF_SAADC_{pin_number}>;
|
||||
zephyr,resolution = <14>;
|
||||
zephyr,oversampling = <8>;
|
||||
}};
|
||||
}};
|
||||
"""
|
||||
channel@{channel_id} {{
|
||||
reg = <{channel_id}>;
|
||||
zephyr,gain = "{gain}";
|
||||
zephyr,reference = "ADC_REF_INTERNAL";
|
||||
zephyr,acquisition-time = <ADC_ACQ_TIME_DEFAULT>;
|
||||
zephyr,input-positive = <NRF_SAADC_{pin_number}>;
|
||||
zephyr,resolution = <14>;
|
||||
zephyr,oversampling = <8>;
|
||||
}};
|
||||
}};
|
||||
"""
|
||||
)
|
||||
|
||||
|
||||
|
||||
@@ -9,6 +9,8 @@ static const char *const TAG = "adc128s102.sensor";
|
||||
|
||||
ADC128S102Sensor::ADC128S102Sensor(uint8_t channel) : channel_(channel) {}
|
||||
|
||||
float ADC128S102Sensor::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
void ADC128S102Sensor::dump_config() {
|
||||
LOG_SENSOR("", "ADC128S102 Sensor", this);
|
||||
ESP_LOGCONFIG(TAG, " Pin: %u", this->channel_);
|
||||
|
||||
@@ -19,6 +19,7 @@ class ADC128S102Sensor : public PollingComponent,
|
||||
|
||||
void update() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
float sample() override;
|
||||
|
||||
protected:
|
||||
|
||||
@@ -150,6 +150,8 @@ void AHT10Component::update() {
|
||||
this->restart_read_();
|
||||
}
|
||||
|
||||
float AHT10Component::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
void AHT10Component::dump_config() {
|
||||
ESP_LOGCONFIG(TAG, "AHT10:");
|
||||
LOG_I2C_DEVICE(this);
|
||||
|
||||
@@ -16,6 +16,7 @@ class AHT10Component : public PollingComponent, public i2c::I2CDevice {
|
||||
void setup() override;
|
||||
void update() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
void set_variant(AHT10Variant variant) { this->variant_ = variant; }
|
||||
|
||||
void set_temperature_sensor(sensor::Sensor *temperature_sensor) { temperature_sensor_ = temperature_sensor; }
|
||||
|
||||
@@ -31,8 +31,7 @@ void AlarmControlPanel::publish_state(AlarmControlPanelState state) {
|
||||
this->last_update_ = millis();
|
||||
if (state != this->current_state_) {
|
||||
auto prev_state = this->current_state_;
|
||||
ESP_LOGD(TAG, "'%s' >> %s (was %s)", this->get_name().c_str(),
|
||||
LOG_STR_ARG(alarm_control_panel_state_to_string(state)),
|
||||
ESP_LOGD(TAG, "Set state to: %s, previous: %s", LOG_STR_ARG(alarm_control_panel_state_to_string(state)),
|
||||
LOG_STR_ARG(alarm_control_panel_state_to_string(prev_state)));
|
||||
this->current_state_ = state;
|
||||
// Single state callback - triggers check get_state() for specific states
|
||||
@@ -67,29 +66,52 @@ void AlarmControlPanel::add_on_ready_callback(std::function<void()> &&callback)
|
||||
this->ready_callback_.add(std::move(callback));
|
||||
}
|
||||
|
||||
void AlarmControlPanel::arm_with_code_(AlarmControlPanelCall &(AlarmControlPanelCall::*arm_method)(),
|
||||
const char *code) {
|
||||
void AlarmControlPanel::arm_away(optional<std::string> code) {
|
||||
auto call = this->make_call();
|
||||
(call.*arm_method)();
|
||||
if (code != nullptr)
|
||||
call.set_code(code);
|
||||
call.arm_away();
|
||||
if (code.has_value())
|
||||
call.set_code(code.value());
|
||||
call.perform();
|
||||
}
|
||||
|
||||
void AlarmControlPanel::arm_away(const char *code) { this->arm_with_code_(&AlarmControlPanelCall::arm_away, code); }
|
||||
|
||||
void AlarmControlPanel::arm_home(const char *code) { this->arm_with_code_(&AlarmControlPanelCall::arm_home, code); }
|
||||
|
||||
void AlarmControlPanel::arm_night(const char *code) { this->arm_with_code_(&AlarmControlPanelCall::arm_night, code); }
|
||||
|
||||
void AlarmControlPanel::arm_vacation(const char *code) {
|
||||
this->arm_with_code_(&AlarmControlPanelCall::arm_vacation, code);
|
||||
void AlarmControlPanel::arm_home(optional<std::string> code) {
|
||||
auto call = this->make_call();
|
||||
call.arm_home();
|
||||
if (code.has_value())
|
||||
call.set_code(code.value());
|
||||
call.perform();
|
||||
}
|
||||
|
||||
void AlarmControlPanel::arm_custom_bypass(const char *code) {
|
||||
this->arm_with_code_(&AlarmControlPanelCall::arm_custom_bypass, code);
|
||||
void AlarmControlPanel::arm_night(optional<std::string> code) {
|
||||
auto call = this->make_call();
|
||||
call.arm_night();
|
||||
if (code.has_value())
|
||||
call.set_code(code.value());
|
||||
call.perform();
|
||||
}
|
||||
|
||||
void AlarmControlPanel::disarm(const char *code) { this->arm_with_code_(&AlarmControlPanelCall::disarm, code); }
|
||||
void AlarmControlPanel::arm_vacation(optional<std::string> code) {
|
||||
auto call = this->make_call();
|
||||
call.arm_vacation();
|
||||
if (code.has_value())
|
||||
call.set_code(code.value());
|
||||
call.perform();
|
||||
}
|
||||
|
||||
void AlarmControlPanel::arm_custom_bypass(optional<std::string> code) {
|
||||
auto call = this->make_call();
|
||||
call.arm_custom_bypass();
|
||||
if (code.has_value())
|
||||
call.set_code(code.value());
|
||||
call.perform();
|
||||
}
|
||||
|
||||
void AlarmControlPanel::disarm(optional<std::string> code) {
|
||||
auto call = this->make_call();
|
||||
call.disarm();
|
||||
if (code.has_value())
|
||||
call.set_code(code.value());
|
||||
call.perform();
|
||||
}
|
||||
|
||||
} // namespace esphome::alarm_control_panel
|
||||
|
||||
@@ -76,53 +76,37 @@ class AlarmControlPanel : public EntityBase {
|
||||
*
|
||||
* @param code The code
|
||||
*/
|
||||
void arm_away(const char *code = nullptr);
|
||||
void arm_away(const optional<std::string> &code) {
|
||||
this->arm_away(code.has_value() ? code.value().c_str() : nullptr);
|
||||
}
|
||||
void arm_away(optional<std::string> code = nullopt);
|
||||
|
||||
/** arm the alarm in home mode
|
||||
*
|
||||
* @param code The code
|
||||
*/
|
||||
void arm_home(const char *code = nullptr);
|
||||
void arm_home(const optional<std::string> &code) {
|
||||
this->arm_home(code.has_value() ? code.value().c_str() : nullptr);
|
||||
}
|
||||
void arm_home(optional<std::string> code = nullopt);
|
||||
|
||||
/** arm the alarm in night mode
|
||||
*
|
||||
* @param code The code
|
||||
*/
|
||||
void arm_night(const char *code = nullptr);
|
||||
void arm_night(const optional<std::string> &code) {
|
||||
this->arm_night(code.has_value() ? code.value().c_str() : nullptr);
|
||||
}
|
||||
void arm_night(optional<std::string> code = nullopt);
|
||||
|
||||
/** arm the alarm in vacation mode
|
||||
*
|
||||
* @param code The code
|
||||
*/
|
||||
void arm_vacation(const char *code = nullptr);
|
||||
void arm_vacation(const optional<std::string> &code) {
|
||||
this->arm_vacation(code.has_value() ? code.value().c_str() : nullptr);
|
||||
}
|
||||
void arm_vacation(optional<std::string> code = nullopt);
|
||||
|
||||
/** arm the alarm in custom bypass mode
|
||||
*
|
||||
* @param code The code
|
||||
*/
|
||||
void arm_custom_bypass(const char *code = nullptr);
|
||||
void arm_custom_bypass(const optional<std::string> &code) {
|
||||
this->arm_custom_bypass(code.has_value() ? code.value().c_str() : nullptr);
|
||||
}
|
||||
void arm_custom_bypass(optional<std::string> code = nullopt);
|
||||
|
||||
/** disarm the alarm
|
||||
*
|
||||
* @param code The code
|
||||
*/
|
||||
void disarm(const char *code = nullptr);
|
||||
void disarm(const optional<std::string> &code) { this->disarm(code.has_value() ? code.value().c_str() : nullptr); }
|
||||
void disarm(optional<std::string> code = nullopt);
|
||||
|
||||
/** Get the state
|
||||
*
|
||||
@@ -134,8 +118,6 @@ class AlarmControlPanel : public EntityBase {
|
||||
|
||||
protected:
|
||||
friend AlarmControlPanelCall;
|
||||
// Helper to reduce code duplication for arm/disarm methods
|
||||
void arm_with_code_(AlarmControlPanelCall &(AlarmControlPanelCall::*arm_method)(), const char *code);
|
||||
// in order to store last panel state in flash
|
||||
ESPPreferenceObject pref_;
|
||||
// current state
|
||||
|
||||
@@ -10,10 +10,8 @@ static const char *const TAG = "alarm_control_panel";
|
||||
|
||||
AlarmControlPanelCall::AlarmControlPanelCall(AlarmControlPanel *parent) : parent_(parent) {}
|
||||
|
||||
AlarmControlPanelCall &AlarmControlPanelCall::set_code(const char *code) {
|
||||
if (code != nullptr) {
|
||||
this->code_ = std::string(code);
|
||||
}
|
||||
AlarmControlPanelCall &AlarmControlPanelCall::set_code(const std::string &code) {
|
||||
this->code_ = code;
|
||||
return *this;
|
||||
}
|
||||
|
||||
|
||||
@@ -14,8 +14,7 @@ class AlarmControlPanelCall {
|
||||
public:
|
||||
AlarmControlPanelCall(AlarmControlPanel *parent);
|
||||
|
||||
AlarmControlPanelCall &set_code(const char *code);
|
||||
AlarmControlPanelCall &set_code(const std::string &code) { return this->set_code(code.c_str()); }
|
||||
AlarmControlPanelCall &set_code(const std::string &code);
|
||||
AlarmControlPanelCall &arm_away();
|
||||
AlarmControlPanelCall &arm_home();
|
||||
AlarmControlPanelCall &arm_night();
|
||||
|
||||
@@ -1,15 +1,32 @@
|
||||
#include "alarm_control_panel_state.h"
|
||||
#include "esphome/core/progmem.h"
|
||||
|
||||
namespace esphome::alarm_control_panel {
|
||||
|
||||
// Alarm control panel state strings indexed by AlarmControlPanelState enum (0-9)
|
||||
PROGMEM_STRING_TABLE(AlarmControlPanelStateStrings, "DISARMED", "ARMED_HOME", "ARMED_AWAY", "ARMED_NIGHT",
|
||||
"ARMED_VACATION", "ARMED_CUSTOM_BYPASS", "PENDING", "ARMING", "DISARMING", "TRIGGERED", "UNKNOWN");
|
||||
|
||||
const LogString *alarm_control_panel_state_to_string(AlarmControlPanelState state) {
|
||||
return AlarmControlPanelStateStrings::get_log_str(static_cast<uint8_t>(state),
|
||||
AlarmControlPanelStateStrings::LAST_INDEX);
|
||||
switch (state) {
|
||||
case ACP_STATE_DISARMED:
|
||||
return LOG_STR("DISARMED");
|
||||
case ACP_STATE_ARMED_HOME:
|
||||
return LOG_STR("ARMED_HOME");
|
||||
case ACP_STATE_ARMED_AWAY:
|
||||
return LOG_STR("ARMED_AWAY");
|
||||
case ACP_STATE_ARMED_NIGHT:
|
||||
return LOG_STR("ARMED_NIGHT");
|
||||
case ACP_STATE_ARMED_VACATION:
|
||||
return LOG_STR("ARMED_VACATION");
|
||||
case ACP_STATE_ARMED_CUSTOM_BYPASS:
|
||||
return LOG_STR("ARMED_CUSTOM_BYPASS");
|
||||
case ACP_STATE_PENDING:
|
||||
return LOG_STR("PENDING");
|
||||
case ACP_STATE_ARMING:
|
||||
return LOG_STR("ARMING");
|
||||
case ACP_STATE_DISARMING:
|
||||
return LOG_STR("DISARMING");
|
||||
case ACP_STATE_TRIGGERED:
|
||||
return LOG_STR("TRIGGERED");
|
||||
default:
|
||||
return LOG_STR("UNKNOWN");
|
||||
}
|
||||
}
|
||||
|
||||
} // namespace esphome::alarm_control_panel
|
||||
|
||||
@@ -66,7 +66,15 @@ template<typename... Ts> class ArmAwayAction : public Action<Ts...> {
|
||||
|
||||
TEMPLATABLE_VALUE(std::string, code)
|
||||
|
||||
void play(const Ts &...x) override { this->alarm_control_panel_->arm_away(this->code_.optional_value(x...)); }
|
||||
void play(const Ts &...x) override {
|
||||
auto call = this->alarm_control_panel_->make_call();
|
||||
auto code = this->code_.optional_value(x...);
|
||||
if (code.has_value()) {
|
||||
call.set_code(code.value());
|
||||
}
|
||||
call.arm_away();
|
||||
call.perform();
|
||||
}
|
||||
|
||||
protected:
|
||||
AlarmControlPanel *alarm_control_panel_;
|
||||
@@ -78,7 +86,15 @@ template<typename... Ts> class ArmHomeAction : public Action<Ts...> {
|
||||
|
||||
TEMPLATABLE_VALUE(std::string, code)
|
||||
|
||||
void play(const Ts &...x) override { this->alarm_control_panel_->arm_home(this->code_.optional_value(x...)); }
|
||||
void play(const Ts &...x) override {
|
||||
auto call = this->alarm_control_panel_->make_call();
|
||||
auto code = this->code_.optional_value(x...);
|
||||
if (code.has_value()) {
|
||||
call.set_code(code.value());
|
||||
}
|
||||
call.arm_home();
|
||||
call.perform();
|
||||
}
|
||||
|
||||
protected:
|
||||
AlarmControlPanel *alarm_control_panel_;
|
||||
@@ -90,7 +106,15 @@ template<typename... Ts> class ArmNightAction : public Action<Ts...> {
|
||||
|
||||
TEMPLATABLE_VALUE(std::string, code)
|
||||
|
||||
void play(const Ts &...x) override { this->alarm_control_panel_->arm_night(this->code_.optional_value(x...)); }
|
||||
void play(const Ts &...x) override {
|
||||
auto call = this->alarm_control_panel_->make_call();
|
||||
auto code = this->code_.optional_value(x...);
|
||||
if (code.has_value()) {
|
||||
call.set_code(code.value());
|
||||
}
|
||||
call.arm_night();
|
||||
call.perform();
|
||||
}
|
||||
|
||||
protected:
|
||||
AlarmControlPanel *alarm_control_panel_;
|
||||
|
||||
@@ -176,5 +176,7 @@ void AM2315C::dump_config() {
|
||||
LOG_SENSOR(" ", "Humidity", this->humidity_sensor_);
|
||||
}
|
||||
|
||||
float AM2315C::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
} // namespace am2315c
|
||||
} // namespace esphome
|
||||
|
||||
@@ -33,6 +33,7 @@ class AM2315C : public PollingComponent, public i2c::I2CDevice {
|
||||
void dump_config() override;
|
||||
void update() override;
|
||||
void setup() override;
|
||||
float get_setup_priority() const override;
|
||||
|
||||
void set_temperature_sensor(sensor::Sensor *temperature_sensor) { this->temperature_sensor_ = temperature_sensor; }
|
||||
void set_humidity_sensor(sensor::Sensor *humidity_sensor) { this->humidity_sensor_ = humidity_sensor; }
|
||||
|
||||
@@ -51,6 +51,7 @@ void AM2320Component::dump_config() {
|
||||
LOG_SENSOR(" ", "Temperature", this->temperature_sensor_);
|
||||
LOG_SENSOR(" ", "Humidity", this->humidity_sensor_);
|
||||
}
|
||||
float AM2320Component::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
bool AM2320Component::read_bytes_(uint8_t a_register, uint8_t *data, uint8_t len, uint32_t conversion) {
|
||||
if (!this->write_bytes(a_register, data, 2)) {
|
||||
|
||||
@@ -11,6 +11,7 @@ class AM2320Component : public PollingComponent, public i2c::I2CDevice {
|
||||
public:
|
||||
void setup() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
void update() override;
|
||||
|
||||
void set_temperature_sensor(sensor::Sensor *temperature_sensor) { temperature_sensor_ = temperature_sensor; }
|
||||
|
||||
@@ -1,12 +1,21 @@
|
||||
#include "am43_base.h"
|
||||
#include "esphome/core/helpers.h"
|
||||
#include <cstring>
|
||||
#include <cstdio>
|
||||
|
||||
namespace esphome {
|
||||
namespace am43 {
|
||||
|
||||
const uint8_t START_PACKET[5] = {0x00, 0xff, 0x00, 0x00, 0x9a};
|
||||
|
||||
std::string pkt_to_hex(const uint8_t *data, uint16_t len) {
|
||||
char buf[64];
|
||||
memset(buf, 0, 64);
|
||||
for (int i = 0; i < len; i++)
|
||||
sprintf(&buf[i * 2], "%02x", data[i]);
|
||||
std::string ret = buf;
|
||||
return ret;
|
||||
}
|
||||
|
||||
Am43Packet *Am43Encoder::get_battery_level_request() {
|
||||
uint8_t data = 0x1;
|
||||
return this->encode_(0xA2, &data, 1);
|
||||
@@ -64,9 +73,7 @@ Am43Packet *Am43Encoder::encode_(uint8_t command, uint8_t *data, uint8_t length)
|
||||
memcpy(&this->packet_.data[7], data, length);
|
||||
this->packet_.length = length + 7;
|
||||
this->checksum_();
|
||||
char hex_buf[format_hex_size(sizeof(this->packet_.data))];
|
||||
ESP_LOGV("am43", "ENC(%d): 0x%s", this->packet_.length,
|
||||
format_hex_to(hex_buf, this->packet_.data, this->packet_.length));
|
||||
ESP_LOGV("am43", "ENC(%d): 0x%s", packet_.length, pkt_to_hex(packet_.data, packet_.length).c_str());
|
||||
return &this->packet_;
|
||||
}
|
||||
|
||||
@@ -81,8 +88,7 @@ void Am43Decoder::decode(const uint8_t *data, uint16_t length) {
|
||||
this->has_set_state_response_ = false;
|
||||
this->has_position_ = false;
|
||||
this->has_pin_response_ = false;
|
||||
char hex_buf[format_hex_size(24)]; // Max expected packet size
|
||||
ESP_LOGV("am43", "DEC(%d): 0x%s", length, format_hex_to(hex_buf, data, length));
|
||||
ESP_LOGV("am43", "DEC(%d): 0x%s", length, pkt_to_hex(data, length).c_str());
|
||||
|
||||
if (length < 2 || data[0] != 0x9a)
|
||||
return;
|
||||
|
||||
@@ -18,31 +18,31 @@ AnovaPacket *AnovaCodec::clean_packet_() {
|
||||
|
||||
AnovaPacket *AnovaCodec::get_read_device_status_request() {
|
||||
this->current_query_ = READ_DEVICE_STATUS;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), "%s", CMD_READ_DEVICE_STATUS);
|
||||
sprintf((char *) this->packet_.data, "%s", CMD_READ_DEVICE_STATUS);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
AnovaPacket *AnovaCodec::get_read_target_temp_request() {
|
||||
this->current_query_ = READ_TARGET_TEMPERATURE;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), "%s", CMD_READ_TARGET_TEMP);
|
||||
sprintf((char *) this->packet_.data, "%s", CMD_READ_TARGET_TEMP);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
AnovaPacket *AnovaCodec::get_read_current_temp_request() {
|
||||
this->current_query_ = READ_CURRENT_TEMPERATURE;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), "%s", CMD_READ_CURRENT_TEMP);
|
||||
sprintf((char *) this->packet_.data, "%s", CMD_READ_CURRENT_TEMP);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
AnovaPacket *AnovaCodec::get_read_unit_request() {
|
||||
this->current_query_ = READ_UNIT;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), "%s", CMD_READ_UNIT);
|
||||
sprintf((char *) this->packet_.data, "%s", CMD_READ_UNIT);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
AnovaPacket *AnovaCodec::get_read_data_request() {
|
||||
this->current_query_ = READ_DATA;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), "%s", CMD_READ_DATA);
|
||||
sprintf((char *) this->packet_.data, "%s", CMD_READ_DATA);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
@@ -50,25 +50,25 @@ AnovaPacket *AnovaCodec::get_set_target_temp_request(float temperature) {
|
||||
this->current_query_ = SET_TARGET_TEMPERATURE;
|
||||
if (this->fahrenheit_)
|
||||
temperature = ctof(temperature);
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), CMD_SET_TARGET_TEMP, temperature);
|
||||
sprintf((char *) this->packet_.data, CMD_SET_TARGET_TEMP, temperature);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
AnovaPacket *AnovaCodec::get_set_unit_request(char unit) {
|
||||
this->current_query_ = SET_UNIT;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), CMD_SET_TEMP_UNIT, unit);
|
||||
sprintf((char *) this->packet_.data, CMD_SET_TEMP_UNIT, unit);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
AnovaPacket *AnovaCodec::get_start_request() {
|
||||
this->current_query_ = START;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), "%s", CMD_START);
|
||||
sprintf((char *) this->packet_.data, CMD_START);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
AnovaPacket *AnovaCodec::get_stop_request() {
|
||||
this->current_query_ = STOP;
|
||||
snprintf((char *) this->packet_.data, sizeof(this->packet_.data), "%s", CMD_STOP);
|
||||
sprintf((char *) this->packet_.data, CMD_STOP);
|
||||
return this->clean_packet_();
|
||||
}
|
||||
|
||||
|
||||
@@ -384,6 +384,7 @@ void APDS9960::process_dataset_(int up, int down, int left, int right) {
|
||||
}
|
||||
}
|
||||
}
|
||||
float APDS9960::get_setup_priority() const { return setup_priority::DATA; }
|
||||
bool APDS9960::is_proximity_enabled_() const {
|
||||
return
|
||||
#ifdef USE_SENSOR
|
||||
|
||||
@@ -32,6 +32,7 @@ class APDS9960 : public PollingComponent, public i2c::I2CDevice {
|
||||
public:
|
||||
void setup() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
void update() override;
|
||||
void loop() override;
|
||||
|
||||
|
||||
@@ -4,7 +4,6 @@ import logging
|
||||
from esphome import automation
|
||||
from esphome.automation import Condition
|
||||
import esphome.codegen as cg
|
||||
from esphome.components.logger import request_log_listener
|
||||
from esphome.config_helpers import get_logger_level
|
||||
import esphome.config_validation as cv
|
||||
from esphome.const import (
|
||||
@@ -327,9 +326,6 @@ async def to_code(config: ConfigType) -> None:
|
||||
# Track controller registration for StaticVector sizing
|
||||
CORE.register_controller()
|
||||
|
||||
# Request a log listener slot for API log streaming
|
||||
request_log_listener()
|
||||
|
||||
cg.add(var.set_port(config[CONF_PORT]))
|
||||
cg.add(var.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))
|
||||
cg.add(var.set_batch_delay(config[CONF_BATCH_DELAY]))
|
||||
|
||||
@@ -45,7 +45,6 @@ service APIConnection {
|
||||
rpc time_command (TimeCommandRequest) returns (void) {}
|
||||
rpc update_command (UpdateCommandRequest) returns (void) {}
|
||||
rpc valve_command (ValveCommandRequest) returns (void) {}
|
||||
rpc water_heater_command (WaterHeaterCommandRequest) returns (void) {}
|
||||
|
||||
rpc subscribe_bluetooth_le_advertisements(SubscribeBluetoothLEAdvertisementsRequest) returns (void) {}
|
||||
rpc bluetooth_device_request(BluetoothDeviceRequest) returns (void) {}
|
||||
@@ -67,8 +66,6 @@ service APIConnection {
|
||||
|
||||
rpc zwave_proxy_frame(ZWaveProxyFrame) returns (void) {}
|
||||
rpc zwave_proxy_request(ZWaveProxyRequest) returns (void) {}
|
||||
|
||||
rpc infrared_rf_transmit_raw_timings(InfraredRFTransmitRawTimingsRequest) returns (void) {}
|
||||
}
|
||||
|
||||
|
||||
@@ -766,7 +763,7 @@ message SubscribeHomeassistantServicesRequest {
|
||||
|
||||
message HomeassistantServiceMap {
|
||||
string key = 1;
|
||||
string value = 2;
|
||||
string value = 2 [(no_zero_copy) = true];
|
||||
}
|
||||
|
||||
message HomeassistantActionRequest {
|
||||
@@ -782,7 +779,7 @@ message HomeassistantActionRequest {
|
||||
bool is_event = 5;
|
||||
uint32 call_id = 6 [(field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES"];
|
||||
bool wants_response = 7 [(field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON"];
|
||||
string response_template = 8 [(field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON"];
|
||||
string response_template = 8 [(no_zero_copy) = true, (field_ifdef) = "USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON"];
|
||||
}
|
||||
|
||||
// Message sent by Home Assistant to ESPHome with service call response data
|
||||
@@ -2440,49 +2437,3 @@ message ZWaveProxyRequest {
|
||||
ZWaveProxyRequestType type = 1;
|
||||
bytes data = 2;
|
||||
}
|
||||
|
||||
// ==================== INFRARED ====================
|
||||
// Note: Feature and capability flag enums are defined in
|
||||
// esphome/components/infrared/infrared.h
|
||||
|
||||
// Listing of infrared instances
|
||||
message ListEntitiesInfraredResponse {
|
||||
option (id) = 135;
|
||||
option (base_class) = "InfoResponseProtoMessage";
|
||||
option (source) = SOURCE_SERVER;
|
||||
option (ifdef) = "USE_INFRARED";
|
||||
|
||||
string object_id = 1;
|
||||
fixed32 key = 2;
|
||||
string name = 3;
|
||||
string icon = 4 [(field_ifdef) = "USE_ENTITY_ICON"];
|
||||
bool disabled_by_default = 5;
|
||||
EntityCategory entity_category = 6;
|
||||
uint32 device_id = 7 [(field_ifdef) = "USE_DEVICES"];
|
||||
uint32 capabilities = 8; // Bitfield of InfraredCapabilityFlags
|
||||
}
|
||||
|
||||
// Command to transmit infrared/RF data using raw timings
|
||||
message InfraredRFTransmitRawTimingsRequest {
|
||||
option (id) = 136;
|
||||
option (source) = SOURCE_CLIENT;
|
||||
option (ifdef) = "USE_IR_RF";
|
||||
|
||||
uint32 device_id = 1 [(field_ifdef) = "USE_DEVICES"];
|
||||
fixed32 key = 2; // Key identifying the transmitter instance
|
||||
uint32 carrier_frequency = 3; // Carrier frequency in Hz
|
||||
uint32 repeat_count = 4; // Number of times to transmit (1 = once, 2 = twice, etc.)
|
||||
repeated sint32 timings = 5 [packed = true, (packed_buffer) = true]; // Raw timings in microseconds (zigzag-encoded): positive = mark (LED/TX on), negative = space (LED/TX off)
|
||||
}
|
||||
|
||||
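The `timings` field above is a protobuf `sint32`, which the wire format stores with zigzag encoding; a minimal Python sketch of that standard mapping, for clarity only:

```python
def zigzag32(n: int) -> int:
    """Standard protobuf zigzag mapping for sint32 values."""
    return (n << 1) ^ (n >> 31)

# zigzag32(0) == 0, zigzag32(-1) == 1, zigzag32(1) == 2, zigzag32(-2) == 3,
# so small negative "space" timings still encode to small varints on the wire.
```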
// Event message for received infrared/RF data
|
||||
message InfraredRFReceiveEvent {
|
||||
option (id) = 137;
|
||||
option (source) = SOURCE_SERVER;
|
||||
option (ifdef) = "USE_IR_RF";
|
||||
option (no_delay) = true;
|
||||
|
||||
uint32 device_id = 1 [(field_ifdef) = "USE_DEVICES"];
|
||||
fixed32 key = 2; // Key identifying the receiver instance
|
||||
repeated sint32 timings = 3 [packed = true, (container_pointer_no_template) = "std::vector<int32_t>"]; // Raw timings in microseconds (zigzag-encoded): alternating mark/space periods
|
||||
}
|
||||
|
||||
File diff suppressed because it is too large
@@ -12,7 +12,6 @@
|
||||
#include "esphome/core/string_ref.h"
|
||||
|
||||
#include <functional>
|
||||
#include <limits>
|
||||
#include <vector>
|
||||
|
||||
namespace esphome::api {
|
||||
@@ -28,7 +27,7 @@ static constexpr size_t MAX_INITIAL_PER_BATCH = 34; // For clients >= AP
|
||||
static_assert(MAX_MESSAGES_PER_BATCH >= MAX_INITIAL_PER_BATCH,
|
||||
"MAX_MESSAGES_PER_BATCH must be >= MAX_INITIAL_PER_BATCH");
|
||||
|
||||
class APIConnection final : public APIServerConnectionBase {
|
||||
class APIConnection final : public APIServerConnection {
|
||||
public:
|
||||
friend class APIServer;
|
||||
friend class ListEntitiesIterator;
|
||||
@@ -39,80 +38,80 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
void loop();
|
||||
|
||||
bool send_list_info_done() {
|
||||
return this->schedule_message_(nullptr, ListEntitiesDoneResponse::MESSAGE_TYPE,
|
||||
ListEntitiesDoneResponse::ESTIMATED_SIZE);
|
||||
return this->schedule_message_(nullptr, &APIConnection::try_send_list_info_done,
|
||||
ListEntitiesDoneResponse::MESSAGE_TYPE, ListEntitiesDoneResponse::ESTIMATED_SIZE);
|
||||
}
|
||||
#ifdef USE_BINARY_SENSOR
|
||||
bool send_binary_sensor_state(binary_sensor::BinarySensor *binary_sensor);
|
||||
#endif
|
||||
#ifdef USE_COVER
|
||||
bool send_cover_state(cover::Cover *cover);
|
||||
void on_cover_command_request(const CoverCommandRequest &msg) override;
|
||||
void cover_command(const CoverCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_FAN
|
||||
bool send_fan_state(fan::Fan *fan);
|
||||
void on_fan_command_request(const FanCommandRequest &msg) override;
|
||||
void fan_command(const FanCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_LIGHT
|
||||
bool send_light_state(light::LightState *light);
|
||||
void on_light_command_request(const LightCommandRequest &msg) override;
|
||||
void light_command(const LightCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_SENSOR
|
||||
bool send_sensor_state(sensor::Sensor *sensor);
|
||||
#endif
|
||||
#ifdef USE_SWITCH
|
||||
bool send_switch_state(switch_::Switch *a_switch);
|
||||
void on_switch_command_request(const SwitchCommandRequest &msg) override;
|
||||
void switch_command(const SwitchCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_TEXT_SENSOR
|
||||
bool send_text_sensor_state(text_sensor::TextSensor *text_sensor);
|
||||
#endif
|
||||
#ifdef USE_CAMERA
|
||||
void set_camera_state(std::shared_ptr<camera::CameraImage> image);
|
||||
void on_camera_image_request(const CameraImageRequest &msg) override;
|
||||
void camera_image(const CameraImageRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_CLIMATE
|
||||
bool send_climate_state(climate::Climate *climate);
|
||||
void on_climate_command_request(const ClimateCommandRequest &msg) override;
|
||||
void climate_command(const ClimateCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_NUMBER
|
||||
bool send_number_state(number::Number *number);
|
||||
void on_number_command_request(const NumberCommandRequest &msg) override;
|
||||
void number_command(const NumberCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_DATETIME_DATE
|
||||
bool send_date_state(datetime::DateEntity *date);
|
||||
void on_date_command_request(const DateCommandRequest &msg) override;
|
||||
void date_command(const DateCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_DATETIME_TIME
|
||||
bool send_time_state(datetime::TimeEntity *time);
|
||||
void on_time_command_request(const TimeCommandRequest &msg) override;
|
||||
void time_command(const TimeCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_DATETIME_DATETIME
|
||||
bool send_datetime_state(datetime::DateTimeEntity *datetime);
|
||||
void on_date_time_command_request(const DateTimeCommandRequest &msg) override;
|
||||
void datetime_command(const DateTimeCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_TEXT
|
||||
bool send_text_state(text::Text *text);
|
||||
void on_text_command_request(const TextCommandRequest &msg) override;
|
||||
void text_command(const TextCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_SELECT
|
||||
bool send_select_state(select::Select *select);
|
||||
void on_select_command_request(const SelectCommandRequest &msg) override;
|
||||
void select_command(const SelectCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_BUTTON
|
||||
void on_button_command_request(const ButtonCommandRequest &msg) override;
|
||||
void button_command(const ButtonCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_LOCK
|
||||
bool send_lock_state(lock::Lock *a_lock);
|
||||
void on_lock_command_request(const LockCommandRequest &msg) override;
|
||||
void lock_command(const LockCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_VALVE
|
||||
bool send_valve_state(valve::Valve *valve);
|
||||
void on_valve_command_request(const ValveCommandRequest &msg) override;
|
||||
void valve_command(const ValveCommandRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_MEDIA_PLAYER
|
||||
bool send_media_player_state(media_player::MediaPlayer *media_player);
|
||||
void on_media_player_command_request(const MediaPlayerCommandRequest &msg) override;
|
||||
void media_player_command(const MediaPlayerCommandRequest &msg) override;
|
||||
#endif
|
||||
bool try_send_log_message(int level, const char *tag, const char *line, size_t message_len);
|
||||
#ifdef USE_API_HOMEASSISTANT_SERVICES
|
||||
@@ -126,18 +125,18 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
|
||||
#endif // USE_API_HOMEASSISTANT_SERVICES
|
||||
#ifdef USE_BLUETOOTH_PROXY
|
||||
void on_subscribe_bluetooth_le_advertisements_request(const SubscribeBluetoothLEAdvertisementsRequest &msg) override;
|
||||
void on_unsubscribe_bluetooth_le_advertisements_request() override;
|
||||
void subscribe_bluetooth_le_advertisements(const SubscribeBluetoothLEAdvertisementsRequest &msg) override;
|
||||
void unsubscribe_bluetooth_le_advertisements(const UnsubscribeBluetoothLEAdvertisementsRequest &msg) override;
|
||||
|
||||
void on_bluetooth_device_request(const BluetoothDeviceRequest &msg) override;
|
||||
void on_bluetooth_gatt_read_request(const BluetoothGATTReadRequest &msg) override;
|
||||
void on_bluetooth_gatt_write_request(const BluetoothGATTWriteRequest &msg) override;
|
||||
void on_bluetooth_gatt_read_descriptor_request(const BluetoothGATTReadDescriptorRequest &msg) override;
|
||||
void on_bluetooth_gatt_write_descriptor_request(const BluetoothGATTWriteDescriptorRequest &msg) override;
|
||||
void on_bluetooth_gatt_get_services_request(const BluetoothGATTGetServicesRequest &msg) override;
|
||||
void on_bluetooth_gatt_notify_request(const BluetoothGATTNotifyRequest &msg) override;
|
||||
void on_subscribe_bluetooth_connections_free_request() override;
|
||||
void on_bluetooth_scanner_set_mode_request(const BluetoothScannerSetModeRequest &msg) override;
|
||||
void bluetooth_device_request(const BluetoothDeviceRequest &msg) override;
|
||||
void bluetooth_gatt_read(const BluetoothGATTReadRequest &msg) override;
|
||||
void bluetooth_gatt_write(const BluetoothGATTWriteRequest &msg) override;
|
||||
void bluetooth_gatt_read_descriptor(const BluetoothGATTReadDescriptorRequest &msg) override;
|
||||
void bluetooth_gatt_write_descriptor(const BluetoothGATTWriteDescriptorRequest &msg) override;
|
||||
void bluetooth_gatt_get_services(const BluetoothGATTGetServicesRequest &msg) override;
|
||||
void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) override;
|
||||
bool send_subscribe_bluetooth_connections_free_response(const SubscribeBluetoothConnectionsFreeRequest &msg) override;
|
||||
void bluetooth_scanner_set_mode(const BluetoothScannerSetModeRequest &msg) override;
|
||||
|
||||
#endif
|
||||
#ifdef USE_HOMEASSISTANT_TIME
|
||||
@@ -148,24 +147,24 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
#endif
|
||||
|
||||
#ifdef USE_VOICE_ASSISTANT
|
||||
void on_subscribe_voice_assistant_request(const SubscribeVoiceAssistantRequest &msg) override;
|
||||
void subscribe_voice_assistant(const SubscribeVoiceAssistantRequest &msg) override;
|
||||
void on_voice_assistant_response(const VoiceAssistantResponse &msg) override;
|
||||
void on_voice_assistant_event_response(const VoiceAssistantEventResponse &msg) override;
|
||||
void on_voice_assistant_audio(const VoiceAssistantAudio &msg) override;
|
||||
void on_voice_assistant_timer_event_response(const VoiceAssistantTimerEventResponse &msg) override;
|
||||
void on_voice_assistant_announce_request(const VoiceAssistantAnnounceRequest &msg) override;
|
||||
void on_voice_assistant_configuration_request(const VoiceAssistantConfigurationRequest &msg) override;
|
||||
void on_voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) override;
|
||||
bool send_voice_assistant_get_configuration_response(const VoiceAssistantConfigurationRequest &msg) override;
|
||||
void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) override;
|
||||
#endif
|
||||
|
||||
#ifdef USE_ZWAVE_PROXY
|
||||
void on_z_wave_proxy_frame(const ZWaveProxyFrame &msg) override;
|
||||
void on_z_wave_proxy_request(const ZWaveProxyRequest &msg) override;
|
||||
void zwave_proxy_frame(const ZWaveProxyFrame &msg) override;
|
||||
void zwave_proxy_request(const ZWaveProxyRequest &msg) override;
|
||||
#endif
|
||||
|
||||
#ifdef USE_ALARM_CONTROL_PANEL
|
||||
bool send_alarm_control_panel_state(alarm_control_panel::AlarmControlPanel *a_alarm_control_panel);
|
||||
void on_alarm_control_panel_command_request(const AlarmControlPanelCommandRequest &msg) override;
|
||||
void alarm_control_panel_command(const AlarmControlPanelCommandRequest &msg) override;
|
||||
#endif
|
||||
|
||||
#ifdef USE_WATER_HEATER
|
||||
@@ -173,22 +172,17 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
void on_water_heater_command_request(const WaterHeaterCommandRequest &msg) override;
|
||||
#endif
|
||||
|
||||
#ifdef USE_IR_RF
|
||||
void on_infrared_rf_transmit_raw_timings_request(const InfraredRFTransmitRawTimingsRequest &msg) override;
|
||||
void send_infrared_rf_receive_event(const InfraredRFReceiveEvent &msg);
|
||||
#endif
|
||||
|
||||
#ifdef USE_EVENT
|
||||
void send_event(event::Event *event);
|
||||
void send_event(event::Event *event, const char *event_type);
|
||||
#endif
|
||||
|
||||
#ifdef USE_UPDATE
|
||||
bool send_update_state(update::UpdateEntity *update);
|
||||
void on_update_command_request(const UpdateCommandRequest &msg) override;
|
||||
void update_command(const UpdateCommandRequest &msg) override;
|
||||
#endif
|
||||
|
||||
void on_disconnect_response() override;
|
||||
void on_ping_response() override {
|
||||
void on_disconnect_response(const DisconnectResponse &value) override;
|
||||
void on_ping_response(const PingResponse &value) override {
|
||||
// we initiated ping
|
||||
this->flags_.sent_ping = false;
|
||||
}
|
||||
@@ -198,12 +192,12 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
#ifdef USE_HOMEASSISTANT_TIME
|
||||
void on_get_time_response(const GetTimeResponse &value) override;
|
||||
#endif
|
||||
void on_hello_request(const HelloRequest &msg) override;
|
||||
void on_disconnect_request() override;
|
||||
void on_ping_request() override;
|
||||
void on_device_info_request() override;
|
||||
void on_list_entities_request() override { this->begin_iterator_(ActiveIterator::LIST_ENTITIES); }
|
||||
void on_subscribe_states_request() override {
|
||||
bool send_hello_response(const HelloRequest &msg) override;
|
||||
bool send_disconnect_response(const DisconnectRequest &msg) override;
|
||||
bool send_ping_response(const PingRequest &msg) override;
|
||||
bool send_device_info_response(const DeviceInfoRequest &msg) override;
|
||||
void list_entities(const ListEntitiesRequest &msg) override { this->begin_iterator_(ActiveIterator::LIST_ENTITIES); }
|
||||
void subscribe_states(const SubscribeStatesRequest &msg) override {
|
||||
this->flags_.state_subscription = true;
|
||||
// Start initial state iterator only if no iterator is active
|
||||
// If list_entities is running, we'll start initial_state when it completes
|
||||
@@ -211,19 +205,21 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
this->begin_iterator_(ActiveIterator::INITIAL_STATE);
|
||||
}
|
||||
}
|
||||
void on_subscribe_logs_request(const SubscribeLogsRequest &msg) override {
|
||||
void subscribe_logs(const SubscribeLogsRequest &msg) override {
|
||||
this->flags_.log_subscription = msg.level;
|
||||
if (msg.dump_config)
|
||||
App.schedule_dump_config();
|
||||
}
|
||||
#ifdef USE_API_HOMEASSISTANT_SERVICES
|
||||
void on_subscribe_homeassistant_services_request() override { this->flags_.service_call_subscription = true; }
|
||||
void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) override {
|
||||
this->flags_.service_call_subscription = true;
|
||||
}
|
||||
#endif
|
||||
#ifdef USE_API_HOMEASSISTANT_STATES
|
||||
void on_subscribe_home_assistant_states_request() override;
|
||||
void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) override;
|
||||
#endif
|
||||
#ifdef USE_API_USER_DEFINED_ACTIONS
|
||||
void on_execute_service_request(const ExecuteServiceRequest &msg) override;
|
||||
void execute_service(const ExecuteServiceRequest &msg) override;
|
||||
#ifdef USE_API_USER_DEFINED_ACTION_RESPONSES
|
||||
void send_execute_service_response(uint32_t call_id, bool success, StringRef error_message);
|
||||
#ifdef USE_API_USER_DEFINED_ACTION_RESPONSES_JSON
|
||||
@@ -233,7 +229,7 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
#endif // USE_API_USER_DEFINED_ACTION_RESPONSES
|
||||
#endif
|
||||
#ifdef USE_API_NOISE
|
||||
void on_noise_encryption_set_key_request(const NoiseEncryptionSetKeyRequest &msg) override;
|
||||
bool send_noise_encryption_set_key_response(const NoiseEncryptionSetKeyRequest &msg) override;
|
||||
#endif
|
||||
|
||||
bool is_authenticated() override {
@@ -253,7 +249,17 @@ class APIConnection final : public APIServerConnectionBase {

void on_fatal_error() override;
void on_no_setup_connection() override;
bool send_message_impl(const ProtoMessage &msg, uint8_t message_type) override;
ProtoWriteBuffer create_buffer(uint32_t reserve_size) override {
// FIXME: ensure no recursive writes can happen

// Get header padding size - used for both reserve and insert
uint8_t header_padding = this->helper_->frame_header_padding();
// Get shared buffer from parent server
std::vector<uint8_t> &shared_buf = this->parent_->get_shared_buffer_ref();
this->prepare_first_message_buffer(shared_buf, header_padding,
reserve_size + header_padding + this->helper_->frame_footer_size());
return {&shared_buf};
}

void prepare_first_message_buffer(std::vector<uint8_t> &shared_buf, size_t header_padding, size_t total_size) {
shared_buf.clear();
@@ -265,41 +271,17 @@ class APIConnection final : public APIServerConnectionBase {
shared_buf.resize(header_padding);
}

// Convenience overload - computes frame overhead internally
void prepare_first_message_buffer(std::vector<uint8_t> &shared_buf, size_t payload_size) {
const uint8_t header_padding = this->helper_->frame_header_padding();
const uint8_t footer_size = this->helper_->frame_footer_size();
this->prepare_first_message_buffer(shared_buf, header_padding, payload_size + header_padding + footer_size);
}

bool try_to_clear_buffer(bool log_out_of_space);
bool send_buffer(ProtoWriteBuffer buffer, uint8_t message_type) override;

const char *get_name() const { return this->helper_->get_client_name(); }
/// Get peer name (IP address) into caller-provided buffer, returns buf for convenience
const char *get_peername_to(std::span<char, socket::SOCKADDR_STR_LEN> buf) const {
return this->helper_->get_peername_to(buf);
}
/// Get peer name (IP address) - cached at connection init time
const char *get_peername() const { return this->helper_->get_client_peername(); }
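The create_buffer() / prepare_first_message_buffer() helpers above leave a gap for the frame header before any payload is encoded into the shared buffer. A minimal standalone sketch of that pattern, assuming the full-frame reserve happens in the lines the hunk elides; the names are illustrative, not the actual ESPHome API:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Sketch only: reserve the whole frame once, then resize to the header
// padding so payload bytes land after the space kept for the frame header.
// The header is written into that gap at send time, avoiding a second
// allocation or a memmove of the payload.
void prepare_frame_buffer(std::vector<uint8_t> &buf, size_t header_padding, size_t payload_size,
                          size_t footer_size) {
  buf.clear();                                               // drop old contents, keep capacity
  buf.reserve(header_padding + payload_size + footer_size);  // assumed single up-front reservation
  buf.resize(header_padding);                                // leave the header gap
  // payload is now appended starting at buf[header_padding]
}
```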
protected:
|
||||
// Helper function to handle authentication completion
|
||||
void complete_authentication_();
|
||||
|
||||
// Pattern B helpers: send response and return success/failure
|
||||
bool send_hello_response_(const HelloRequest &msg);
|
||||
bool send_disconnect_response_();
|
||||
bool send_ping_response_();
|
||||
bool send_device_info_response_();
|
||||
#ifdef USE_API_NOISE
|
||||
bool send_noise_encryption_set_key_response_(const NoiseEncryptionSetKeyRequest &msg);
|
||||
#endif
|
||||
#ifdef USE_BLUETOOTH_PROXY
|
||||
bool send_subscribe_bluetooth_connections_free_response_();
|
||||
#endif
|
||||
#ifdef USE_VOICE_ASSISTANT
|
||||
bool send_voice_assistant_get_configuration_response_(const VoiceAssistantConfigurationRequest &msg);
|
||||
#endif
|
||||
|
||||
#ifdef USE_CAMERA
|
||||
void try_send_camera_image_();
|
||||
#endif
|
||||
@@ -310,21 +292,21 @@ class APIConnection final : public APIServerConnectionBase {

// Non-template helper to encode any ProtoMessage
static uint16_t encode_message_to_buffer(ProtoMessage &msg, uint8_t message_type, APIConnection *conn,
uint32_t remaining_size);
uint32_t remaining_size, bool is_single);

// Helper to fill entity state base and encode message
static uint16_t fill_and_encode_entity_state(EntityBase *entity, StateResponseProtoMessage &msg, uint8_t message_type,
APIConnection *conn, uint32_t remaining_size) {
APIConnection *conn, uint32_t remaining_size, bool is_single) {
msg.key = entity->get_object_id_hash();
#ifdef USE_DEVICES
msg.device_id = entity->get_device_id();
#endif
return encode_message_to_buffer(msg, message_type, conn, remaining_size);
return encode_message_to_buffer(msg, message_type, conn, remaining_size, is_single);
}

// Helper to fill entity info base and encode message
static uint16_t fill_and_encode_entity_info(EntityBase *entity, InfoResponseProtoMessage &msg, uint8_t message_type,
APIConnection *conn, uint32_t remaining_size) {
APIConnection *conn, uint32_t remaining_size, bool is_single) {
// Set common fields that are shared by all entity types
msg.key = entity->get_object_id_hash();

@@ -351,7 +333,7 @@ class APIConnection final : public APIServerConnectionBase {
#ifdef USE_DEVICES
msg.device_id = entity->get_device_id();
#endif
return encode_message_to_buffer(msg, message_type, conn, remaining_size);
return encode_message_to_buffer(msg, message_type, conn, remaining_size, is_single);
}
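The fill_and_encode_entity_state()/fill_and_encode_entity_info() helpers above keep the per-entity-type work down to filling a couple of common fields and then funnel every message through a single non-template encoder. A rough, self-contained sketch of that code-size pattern, with placeholder types standing in for ProtoMessage and the real buffer plumbing:

```cpp
#include <cstdint>

// Placeholder types; the real code uses ProtoMessage and the shared buffer.
struct Message {
  uint32_t key{0};
  virtual ~Message() = default;
  virtual uint16_t encode(uint8_t *out, uint32_t cap) const = 0;
};

// Emitted exactly once in the binary: all heavy encoding goes through here.
uint16_t encode_to_buffer(const Message &msg, uint8_t *out, uint32_t cap) { return msg.encode(out, cap); }

// Per-type helpers stay trivial: fill the shared fields, then delegate, so
// the encoder is not re-instantiated for every message type.
template<typename MsgT> uint16_t fill_and_encode(MsgT &msg, uint32_t key, uint8_t *out, uint32_t cap) {
  msg.key = key;  // common field shared by all entity messages
  return encode_to_buffer(msg, out, cap);
}
```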
#ifdef USE_VOICE_ASSISTANT
|
||||
@@ -382,108 +364,137 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
}
|
||||
|
||||
#ifdef USE_BINARY_SENSOR
|
||||
static uint16_t try_send_binary_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_binary_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_binary_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_binary_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_COVER
|
||||
static uint16_t try_send_cover_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_cover_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_cover_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_cover_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_FAN
|
||||
static uint16_t try_send_fan_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_fan_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_fan_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
static uint16_t try_send_fan_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_LIGHT
|
||||
static uint16_t try_send_light_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_light_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_light_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_light_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_SENSOR
|
||||
static uint16_t try_send_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_SWITCH
|
||||
static uint16_t try_send_switch_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_switch_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_switch_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_switch_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_TEXT_SENSOR
|
||||
static uint16_t try_send_text_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_text_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_text_sensor_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_text_sensor_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_CLIMATE
|
||||
static uint16_t try_send_climate_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_climate_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_climate_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_climate_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_NUMBER
|
||||
static uint16_t try_send_number_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_number_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_number_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_number_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_DATETIME_DATE
|
||||
static uint16_t try_send_date_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_date_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_date_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
static uint16_t try_send_date_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_DATETIME_TIME
|
||||
static uint16_t try_send_time_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_time_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_time_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
static uint16_t try_send_time_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_DATETIME_DATETIME
|
||||
static uint16_t try_send_datetime_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_datetime_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_datetime_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_datetime_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_TEXT
|
||||
static uint16_t try_send_text_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_text_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_text_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
static uint16_t try_send_text_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_SELECT
|
||||
static uint16_t try_send_select_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_select_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_select_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_select_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_BUTTON
|
||||
static uint16_t try_send_button_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_button_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_LOCK
|
||||
static uint16_t try_send_lock_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_lock_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_lock_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
static uint16_t try_send_lock_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_VALVE
|
||||
static uint16_t try_send_valve_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_valve_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_valve_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_valve_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_MEDIA_PLAYER
|
||||
static uint16_t try_send_media_player_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_media_player_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_media_player_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_media_player_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_ALARM_CONTROL_PANEL
|
||||
static uint16_t try_send_alarm_control_panel_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_alarm_control_panel_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_alarm_control_panel_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_alarm_control_panel_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_WATER_HEATER
|
||||
static uint16_t try_send_water_heater_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_water_heater_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
#endif
|
||||
#ifdef USE_INFRARED
|
||||
static uint16_t try_send_infrared_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_water_heater_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_water_heater_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_EVENT
|
||||
static uint16_t try_send_event_response(event::Event *event, StringRef event_type, APIConnection *conn,
|
||||
uint32_t remaining_size);
|
||||
static uint16_t try_send_event_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_event_response(event::Event *event, const char *event_type, APIConnection *conn,
|
||||
uint32_t remaining_size, bool is_single);
|
||||
static uint16_t try_send_event_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single);
|
||||
#endif
|
||||
#ifdef USE_UPDATE
|
||||
static uint16_t try_send_update_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_update_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_update_state(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
static uint16_t try_send_update_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
#ifdef USE_CAMERA
|
||||
static uint16_t try_send_camera_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_camera_info(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
#endif
|
||||
|
||||
// Method for ListEntitiesDone batching
|
||||
static uint16_t try_send_list_info_done(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_list_info_done(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
|
||||
// Method for DisconnectRequest batching
|
||||
static uint16_t try_send_disconnect_request(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_disconnect_request(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
|
||||
// Batch message method for ping requests
|
||||
static uint16_t try_send_ping_request(EntityBase *entity, APIConnection *conn, uint32_t remaining_size);
|
||||
static uint16_t try_send_ping_request(EntityBase *entity, APIConnection *conn, uint32_t remaining_size,
|
||||
bool is_single);
|
||||
|
||||
// === Optimal member ordering for 32-bit systems ===
|
||||
|
||||
@@ -518,19 +529,35 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
#endif
|
||||
|
||||
// Function pointer type for message encoding
using MessageCreatorPtr = uint16_t (*)(EntityBase *, APIConnection *, uint32_t remaining_size);
using MessageCreatorPtr = uint16_t (*)(EntityBase *, APIConnection *, uint32_t remaining_size, bool is_single);

class MessageCreator {
public:
MessageCreator(MessageCreatorPtr ptr) { data_.function_ptr = ptr; }
explicit MessageCreator(const char *str_value) { data_.const_char_ptr = str_value; }

// Call operator - uses message_type to determine union type
uint16_t operator()(EntityBase *entity, APIConnection *conn, uint32_t remaining_size, bool is_single,
uint8_t message_type) const;

private:
union Data {
MessageCreatorPtr function_ptr;
const char *const_char_ptr;
} data_; // 4 bytes on 32-bit, 8 bytes on 64-bit
};

// Generic batching mechanism for both state updates and entity info
struct DeferredBatch {
// Sentinel value for unused aux_data_index
static constexpr uint8_t AUX_DATA_UNUSED = std::numeric_limits<uint8_t>::max();

struct BatchItem {
EntityBase *entity; // 4 bytes - Entity pointer
uint8_t message_type; // 1 byte - Message type for protocol and dispatch
uint8_t estimated_size; // 1 byte - Estimated message size (max 255 bytes)
uint8_t aux_data_index{AUX_DATA_UNUSED}; // 1 byte - For events: index into entity's event_types
// 1 byte padding
EntityBase *entity; // Entity pointer
MessageCreator creator; // Function that creates the message when needed
uint8_t message_type; // Message type for overhead calculation (max 255)
uint8_t estimated_size; // Estimated message size (max 255 bytes)

// Constructor for creating BatchItem
BatchItem(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size)
: entity(entity), creator(creator), message_type(message_type), estimated_size(estimated_size) {}
};

std::vector<BatchItem> items;
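MessageCreator above packs either a creator function pointer or an event-type string into one pointer-sized union, and the separately stored message_type tells the call operator which member is live. A simplified, self-contained sketch of that idea (all names hypothetical; a bool stands in for the message-type check):

```cpp
#include <cstdint>

using CreatorFn = uint16_t (*)(void *entity, uint32_t remaining);

// One pointer of storage holds either a creator function or a string payload;
// context supplied at the call site decides which member is active.
class SmallCreator {
 public:
  SmallCreator(CreatorFn fn) { data_.fn = fn; }
  explicit SmallCreator(const char *event_type) { data_.str = event_type; }

  uint16_t operator()(void *entity, uint32_t remaining, bool is_event_message) const {
    if (is_event_message) {
      // the real code would build an event response from data_.str here
      (void) entity;
      (void) remaining;
      return 0;  // placeholder
    }
    return data_.fn(entity, remaining);
  }

 private:
  union {
    CreatorFn fn;
    const char *str;
  } data_;  // pointer-sized, no extra discriminant byte per batch item
};
```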
@@ -539,11 +566,10 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
// No pre-allocation - log connections never use batching, and for
|
||||
// connections that do, buffers are released after initial sync anyway
|
||||
|
||||
// Add item to the batch (with deduplication)
|
||||
void add_item(EntityBase *entity, uint8_t message_type, uint8_t estimated_size,
|
||||
uint8_t aux_data_index = AUX_DATA_UNUSED);
|
||||
// Add item to the batch
|
||||
void add_item(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size);
|
||||
// Add item to the front of the batch (for high priority messages like ping)
|
||||
void add_item_front(EntityBase *entity, uint8_t message_type, uint8_t estimated_size);
|
||||
void add_item_front(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size);
|
||||
|
||||
// Clear all items
|
||||
void clear() {
|
||||
@@ -557,7 +583,6 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
bool empty() const { return items.empty(); }
|
||||
size_t size() const { return items.size(); }
|
||||
const BatchItem &operator[](size_t index) const { return items[index]; }
|
||||
|
||||
// Release excess capacity - only releases if items already empty
|
||||
void release_buffer() {
|
||||
// Safe to call: batch is processed before release_buffer is called,
|
||||
@@ -629,16 +654,18 @@ class APIConnection final : public APIServerConnectionBase {
|
||||
this->flags_.batch_scheduled = false;
|
||||
}
|
||||
|
||||
// Dispatch message encoding based on message_type - replaces function pointer storage
|
||||
// Switch assigns pointer, single call site for smaller code size
|
||||
uint16_t dispatch_message_(const DeferredBatch::BatchItem &item, uint32_t remaining_size, bool batch_first);
|
||||
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
void log_batch_item_(const DeferredBatch::BatchItem &item) {
|
||||
// Helper to log a proto message from a MessageCreator object
|
||||
void log_proto_message_(EntityBase *entity, const MessageCreator &creator, uint8_t message_type) {
|
||||
this->flags_.log_only_mode = true;
|
||||
this->dispatch_message_(item, MAX_BATCH_PACKET_SIZE, true);
|
||||
creator(entity, this, MAX_BATCH_PACKET_SIZE, true, message_type);
|
||||
this->flags_.log_only_mode = false;
|
||||
}
|
||||
|
||||
void log_batch_item_(const DeferredBatch::BatchItem &item) {
|
||||
// Use the helper to log the message
|
||||
this->log_proto_message_(item.entity, item.creator, item.message_type);
|
||||
}
|
||||
#endif
|
||||
|
||||
// Helper to check if a message type should bypass batching
@@ -662,19 +689,63 @@ class APIConnection final : public APIServerConnectionBase {
// Helper method to send a message either immediately or via batching
// Tries immediate send if should_send_immediately_() returns true and buffer has space
// Falls back to batching if immediate send fails or isn't applicable
bool send_message_smart_(EntityBase *entity, uint8_t message_type, uint8_t estimated_size,
uint8_t aux_data_index = DeferredBatch::AUX_DATA_UNUSED);
bool send_message_smart_(EntityBase *entity, MessageCreatorPtr creator, uint8_t message_type,
uint8_t estimated_size) {
if (this->should_send_immediately_(message_type) && this->helper_->can_write_without_blocking()) {
// Now actually encode and send
if (creator(entity, this, MAX_BATCH_PACKET_SIZE, true) &&
this->send_buffer(ProtoWriteBuffer{&this->parent_->get_shared_buffer_ref()}, message_type)) {
#ifdef HAS_PROTO_MESSAGE_DUMP
// Log the message in verbose mode
this->log_proto_message_(entity, MessageCreator(creator), message_type);
#endif
return true;
}

// If immediate send failed, fall through to batching
}

// Fall back to scheduled batching
return this->schedule_message_(entity, creator, message_type, estimated_size);
}

// Overload for MessageCreator (used by events which need to capture event_type)
bool send_message_smart_(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size) {
// Try to send immediately if message type should bypass batching and buffer has space
if (this->should_send_immediately_(message_type) && this->helper_->can_write_without_blocking()) {
// Now actually encode and send
if (creator(entity, this, MAX_BATCH_PACKET_SIZE, true, message_type) &&
this->send_buffer(ProtoWriteBuffer{&this->parent_->get_shared_buffer_ref()}, message_type)) {
#ifdef HAS_PROTO_MESSAGE_DUMP
// Log the message in verbose mode
this->log_proto_message_(entity, creator, message_type);
#endif
return true;
}

// If immediate send failed, fall through to batching
}

// Fall back to scheduled batching
return this->schedule_message_(entity, creator, message_type, estimated_size);
}

// Helper function to schedule a deferred message with known message type
bool schedule_message_(EntityBase *entity, uint8_t message_type, uint8_t estimated_size,
uint8_t aux_data_index = DeferredBatch::AUX_DATA_UNUSED) {
this->deferred_batch_.add_item(entity, message_type, estimated_size, aux_data_index);
bool schedule_message_(EntityBase *entity, MessageCreator creator, uint8_t message_type, uint8_t estimated_size) {
this->deferred_batch_.add_item(entity, creator, message_type, estimated_size);
return this->schedule_batch_();
}

// Overload for function pointers (for info messages and current state reads)
bool schedule_message_(EntityBase *entity, MessageCreatorPtr function_ptr, uint8_t message_type,
uint8_t estimated_size) {
return schedule_message_(entity, MessageCreator(function_ptr), message_type, estimated_size);
}

// Helper function to schedule a high priority message at the front of the batch
bool schedule_message_front_(EntityBase *entity, uint8_t message_type, uint8_t estimated_size) {
this->deferred_batch_.add_item_front(entity, message_type, estimated_size);
bool schedule_message_front_(EntityBase *entity, MessageCreatorPtr function_ptr, uint8_t message_type,
uint8_t estimated_size) {
this->deferred_batch_.add_item_front(entity, MessageCreator(function_ptr), message_type, estimated_size);
return this->schedule_batch_();
}
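send_message_smart_() above follows a "send now or defer" shape: types that bypass batching are encoded and written inline when the socket can accept data without blocking, and anything else, including a failed inline attempt, is pushed into the deferred batch. A compact, self-contained sketch of that control flow with stand-in helpers (none of these names are the real ESPHome API):

```cpp
#include <cstdint>
#include <vector>

struct Item {
  uint8_t type;
  uint8_t estimated_size;
};

struct Sender {
  std::vector<Item> batch;  // deferred items, drained by a scheduled flush
  bool flush_scheduled = false;

  bool socket_writable_now() const { return true; }         // stand-in for can_write_without_blocking()
  bool bypasses_batching(uint8_t type) const { return type == 1; }
  bool encode_and_write(const Item &) { return true; }      // stand-in for encode + send_buffer()

  bool schedule_flush() {
    flush_scheduled = true;  // the real code arms a one-shot flush if not already pending
    return true;
  }

  bool send_smart(const Item &item) {
    // Time-sensitive types go out immediately when the socket can take data.
    if (bypasses_batching(item.type) && socket_writable_now()) {
      if (encode_and_write(item))
        return true;  // sent inline, nothing queued
      // immediate path failed: fall through and defer instead
    }
    batch.push_back(item);  // the real batch also deduplicates entries
    return schedule_flush();
  }
};
```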
@@ -16,12 +16,7 @@ static const char *const TAG = "api.frame_helper";
|
||||
static constexpr size_t API_MAX_LOG_BYTES = 168;
|
||||
|
||||
#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERY_VERBOSE
|
||||
#define HELPER_LOG(msg, ...) \
|
||||
do { \
|
||||
char peername_buf[socket::SOCKADDR_STR_LEN]; \
|
||||
this->get_peername_to(peername_buf); \
|
||||
ESP_LOGVV(TAG, "%s (%s): " msg, this->client_name_, peername_buf, ##__VA_ARGS__); \
|
||||
} while (0)
|
||||
#define HELPER_LOG(msg, ...) ESP_LOGVV(TAG, "%s (%s): " msg, this->client_name_, this->client_peername_, ##__VA_ARGS__)
|
||||
#else
|
||||
#define HELPER_LOG(msg, ...) ((void) 0)
|
||||
#endif
|
||||
@@ -245,20 +240,13 @@ APIError APIFrameHelper::try_send_tx_buf_() {
|
||||
return APIError::OK; // All buffers sent successfully
|
||||
}
|
||||
|
||||
const char *APIFrameHelper::get_peername_to(std::span<char, socket::SOCKADDR_STR_LEN> buf) const {
|
||||
if (this->socket_) {
|
||||
this->socket_->getpeername_to(buf);
|
||||
} else {
|
||||
buf[0] = '\0';
|
||||
}
|
||||
return buf.data();
|
||||
}
|
||||
|
||||
APIError APIFrameHelper::init_common_() {
|
||||
if (state_ != State::INITIALIZE || this->socket_ == nullptr) {
|
||||
HELPER_LOG("Bad state for init %d", (int) state_);
|
||||
return APIError::BAD_STATE;
|
||||
}
|
||||
// Cache peername now while socket is valid - needed for error logging after socket failure
|
||||
this->socket_->getpeername_to(this->client_peername_);
|
||||
int err = this->socket_->setblocking(false);
|
||||
if (err != 0) {
|
||||
state_ = State::FAILED;
|
||||
|
||||
@@ -90,9 +90,8 @@ class APIFrameHelper {
|
||||
|
||||
// Get client name (null-terminated)
|
||||
const char *get_client_name() const { return this->client_name_; }
|
||||
// Get client peername/IP into caller-provided buffer (fetches on-demand from socket)
|
||||
// Returns pointer to buf for convenience in printf-style calls
|
||||
const char *get_peername_to(std::span<char, socket::SOCKADDR_STR_LEN> buf) const;
|
||||
// Get client peername/IP (null-terminated, cached at init time for availability after socket failure)
|
||||
const char *get_client_peername() const { return this->client_peername_; }
|
||||
// Set client name from buffer with length (truncates if needed)
|
||||
void set_client_name(const char *name, size_t len) {
|
||||
size_t copy_len = std::min(len, sizeof(this->client_name_) - 1);
|
||||
@@ -106,8 +105,6 @@ class APIFrameHelper {
|
||||
bool can_write_without_blocking() { return this->state_ == State::DATA && this->tx_buf_count_ == 0; }
|
||||
int getpeername(struct sockaddr *addr, socklen_t *addrlen) { return socket_->getpeername(addr, addrlen); }
|
||||
APIError close() {
|
||||
if (state_ == State::CLOSED)
|
||||
return APIError::OK; // Already closed
|
||||
state_ = State::CLOSED;
|
||||
int err = this->socket_->close();
|
||||
if (err == -1)
|
||||
@@ -123,39 +120,26 @@ class APIFrameHelper {
|
||||
}
|
||||
return APIError::OK;
|
||||
}
|
||||
// Manage TCP_NODELAY (Nagle's algorithm) based on message type.
//
// For non-log messages (sensor data, state updates): Always disable Nagle
// (NODELAY on) for immediate delivery - these are time-sensitive.
//
// For log messages: Use Nagle to coalesce multiple small log packets into
// fewer larger packets, reducing WiFi overhead. However, we limit batching
// to 3 messages to avoid excessive LWIP buffer pressure on memory-constrained
// devices like ESP8266. LWIP's TCP_OVERSIZE option coalesces the data into
// shared pbufs, but holding data too long waiting for Nagle's timer causes
// buffer exhaustion and dropped messages.
//
// Flow: Log 1 (Nagle on) -> Log 2 (Nagle on) -> Log 3 (NODELAY, flush all)
//
void set_nodelay_for_message(bool is_log_message) {
if (!is_log_message) {
if (this->nodelay_state_ != NODELAY_ON) {
this->set_nodelay_raw_(true);
this->nodelay_state_ = NODELAY_ON;
}
return;
}

// Log messages 1-3: state transitions -1 -> 1 -> 2 -> -1 (flush on 3rd)
if (this->nodelay_state_ == NODELAY_ON) {
this->set_nodelay_raw_(false);
this->nodelay_state_ = 1;
} else if (this->nodelay_state_ >= LOG_NAGLE_COUNT) {
this->set_nodelay_raw_(true);
this->nodelay_state_ = NODELAY_ON;
} else {
this->nodelay_state_++;
/// Toggle TCP_NODELAY socket option to control Nagle's algorithm.
///
/// This is used to allow log messages to coalesce (Nagle enabled) while keeping
/// state updates low-latency (NODELAY enabled). Without this, many small log
/// packets fill the TCP send buffer, crowding out important state updates.
///
/// State is tracked to minimize setsockopt() overhead - on lwip_raw (ESP8266/RP2040)
/// this is just a boolean assignment; on other platforms it's a lightweight syscall.
///
/// @param enable true to enable NODELAY (disable Nagle), false to enable Nagle
/// @return true if successful or already in desired state
bool set_nodelay(bool enable) {
if (this->nodelay_enabled_ == enable)
return true;
int val = enable ? 1 : 0;
int err = this->socket_->setsockopt(IPPROTO_TCP, TCP_NODELAY, &val, sizeof(int));
if (err == 0) {
this->nodelay_enabled_ = enable;
}
return err == 0;
}
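set_nodelay() above caches the last TCP_NODELAY value so repeated toggles around log and state messages do not turn into redundant socket calls. The sketch below shows the same pattern against the plain POSIX socket API rather than ESPHome's socket wrapper; the class name is made up for illustration.

```cpp
#include <netinet/in.h>   // IPPROTO_TCP
#include <netinet/tcp.h>  // TCP_NODELAY
#include <sys/socket.h>   // setsockopt()

// Caching the last value skips redundant setsockopt() calls when the caller
// toggles Nagle around every log/state message.
class NodelayCache {
 public:
  explicit NodelayCache(int fd) : fd_(fd) {}

  // enable == true disables Nagle (TCP_NODELAY on); false re-enables Nagle.
  bool set_nodelay(bool enable) {
    if (enabled_ == enable)
      return true;  // already in the desired state, no syscall
    int val = enable ? 1 : 0;
    if (setsockopt(fd_, IPPROTO_TCP, TCP_NODELAY, &val, sizeof(val)) != 0)
      return false;
    enabled_ = enable;
    return true;
  }

 private:
  int fd_;
  bool enabled_{true};  // assumed to start with NODELAY enabled, as after init_common_()
};
```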
virtual APIError write_protobuf_packet(uint8_t type, ProtoWriteBuffer buffer) = 0;
|
||||
// Write multiple protobuf messages in a single operation
|
||||
@@ -234,6 +218,8 @@ class APIFrameHelper {
|
||||
|
||||
// Client name buffer - stores name from Hello message or initial peername
|
||||
char client_name_[CLIENT_INFO_NAME_MAX_LEN]{};
|
||||
// Cached peername/IP address - captured at init time for availability after socket failure
|
||||
char client_peername_[socket::SOCKADDR_STR_LEN]{};
|
||||
|
||||
// Group smaller types together
|
||||
uint16_t rx_buf_len_ = 0;
|
||||
@@ -243,18 +229,10 @@ class APIFrameHelper {
|
||||
uint8_t tx_buf_head_{0};
|
||||
uint8_t tx_buf_tail_{0};
|
||||
uint8_t tx_buf_count_{0};
|
||||
// Nagle batching state for log messages. NODELAY_ON (-1) means NODELAY is enabled
|
||||
// (immediate send). Values 1-2 count log messages in the current Nagle batch.
|
||||
// After LOG_NAGLE_COUNT logs, we switch to NODELAY to flush and reset.
|
||||
static constexpr int8_t NODELAY_ON = -1;
|
||||
static constexpr int8_t LOG_NAGLE_COUNT = 2;
|
||||
int8_t nodelay_state_{NODELAY_ON};
|
||||
|
||||
// Internal helper to set TCP_NODELAY socket option
|
||||
void set_nodelay_raw_(bool enable) {
|
||||
int val = enable ? 1 : 0;
|
||||
this->socket_->setsockopt(IPPROTO_TCP, TCP_NODELAY, &val, sizeof(int));
|
||||
}
|
||||
// Tracks TCP_NODELAY state to minimize setsockopt() calls. Initialized to true
|
||||
// since init_common_() enables NODELAY. Used by set_nodelay() to allow log
|
||||
// messages to coalesce while keeping state updates low-latency.
|
||||
bool nodelay_enabled_{true};
|
||||
|
||||
// Common initialization for both plaintext and noise protocols
|
||||
APIError init_common_();
|
||||
|
||||
@@ -3,7 +3,6 @@
|
||||
#ifdef USE_API_NOISE
|
||||
#include "api_connection.h" // For ClientInfo struct
|
||||
#include "esphome/core/application.h"
|
||||
#include "esphome/core/entity_base.h"
|
||||
#include "esphome/core/hal.h"
|
||||
#include "esphome/core/helpers.h"
|
||||
#include "esphome/core/log.h"
|
||||
@@ -29,12 +28,7 @@ static constexpr size_t PROLOGUE_INIT_LEN = 12; // strlen("NoiseAPIInit")
|
||||
static constexpr size_t API_MAX_LOG_BYTES = 168;
|
||||
|
||||
#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERY_VERBOSE
|
||||
#define HELPER_LOG(msg, ...) \
|
||||
do { \
|
||||
char peername_buf[socket::SOCKADDR_STR_LEN]; \
|
||||
this->get_peername_to(peername_buf); \
|
||||
ESP_LOGVV(TAG, "%s (%s): " msg, this->client_name_, peername_buf, ##__VA_ARGS__); \
|
||||
} while (0)
|
||||
#define HELPER_LOG(msg, ...) ESP_LOGVV(TAG, "%s (%s): " msg, this->client_name_, this->client_peername_, ##__VA_ARGS__)
|
||||
#else
|
||||
#define HELPER_LOG(msg, ...) ((void) 0)
|
||||
#endif
|
||||
@@ -262,30 +256,28 @@ APIError APINoiseFrameHelper::state_action_() {
|
||||
}
|
||||
if (state_ == State::SERVER_HELLO) {
|
||||
// send server hello
|
||||
constexpr size_t mac_len = 13; // 12 hex chars + null terminator
|
||||
const std::string &name = App.get_name();
|
||||
char mac[MAC_ADDRESS_BUFFER_SIZE];
|
||||
char mac[mac_len];
|
||||
get_mac_address_into_buffer(mac);
|
||||
|
||||
// Calculate positions and sizes
|
||||
size_t name_len = name.size() + 1; // including null terminator
|
||||
size_t name_offset = 1;
|
||||
size_t mac_offset = name_offset + name_len;
|
||||
size_t total_size = 1 + name_len + MAC_ADDRESS_BUFFER_SIZE;
|
||||
size_t total_size = 1 + name_len + mac_len;
|
||||
|
||||
// 1 (proto) + name (max ESPHOME_DEVICE_NAME_MAX_LEN) + 1 (name null)
|
||||
// + mac (MAC_ADDRESS_BUFFER_SIZE - 1) + 1 (mac null)
|
||||
constexpr size_t max_msg_size = 1 + ESPHOME_DEVICE_NAME_MAX_LEN + 1 + MAC_ADDRESS_BUFFER_SIZE;
|
||||
uint8_t msg[max_msg_size];
|
||||
auto msg = std::make_unique<uint8_t[]>(total_size);
|
||||
|
||||
// chosen proto
|
||||
msg[0] = 0x01;
|
||||
|
||||
// node name, terminated by null byte
|
||||
std::memcpy(msg + name_offset, name.c_str(), name_len);
|
||||
std::memcpy(msg.get() + name_offset, name.c_str(), name_len);
|
||||
// node mac, terminated by null byte
|
||||
std::memcpy(msg + mac_offset, mac, MAC_ADDRESS_BUFFER_SIZE);
|
||||
std::memcpy(msg.get() + mac_offset, mac, mac_len);
|
||||
|
||||
aerr = write_frame_(msg, total_size);
|
||||
aerr = write_frame_(msg.get(), total_size);
|
||||
if (aerr != APIError::OK)
|
||||
return aerr;
|
||||
|
||||
@@ -361,32 +353,35 @@ APIError APINoiseFrameHelper::state_action_() {
|
||||
return APIError::OK;
|
||||
}
|
||||
void APINoiseFrameHelper::send_explicit_handshake_reject_(const LogString *reason) {
|
||||
// Max reject message: "Bad handshake packet len" (24) + 1 (failure byte) = 25 bytes
|
||||
uint8_t data[32];
|
||||
data[0] = 0x01; // failure
|
||||
|
||||
#ifdef USE_STORE_LOG_STR_IN_FLASH
|
||||
// On ESP8266 with flash strings, we need to use PROGMEM-aware functions
|
||||
size_t reason_len = strlen_P(reinterpret_cast<PGM_P>(reason));
|
||||
size_t data_size = reason_len + 1;
|
||||
auto data = std::make_unique<uint8_t[]>(data_size);
|
||||
data[0] = 0x01; // failure
|
||||
|
||||
// Copy error message from PROGMEM
|
||||
if (reason_len > 0) {
|
||||
memcpy_P(data + 1, reinterpret_cast<PGM_P>(reason), reason_len);
|
||||
memcpy_P(data.get() + 1, reinterpret_cast<PGM_P>(reason), reason_len);
|
||||
}
|
||||
#else
|
||||
// Normal memory access
|
||||
const char *reason_str = LOG_STR_ARG(reason);
|
||||
size_t reason_len = strlen(reason_str);
|
||||
size_t data_size = reason_len + 1;
|
||||
auto data = std::make_unique<uint8_t[]>(data_size);
|
||||
data[0] = 0x01; // failure
|
||||
|
||||
// Copy error message in bulk
|
||||
if (reason_len > 0) {
|
||||
// NOLINTNEXTLINE(bugprone-not-null-terminated-result) - binary protocol, not a C string
|
||||
std::memcpy(data + 1, reason_str, reason_len);
|
||||
std::memcpy(data.get() + 1, reason_str, reason_len);
|
||||
}
|
||||
#endif
|
||||
|
||||
size_t data_size = reason_len + 1;
|
||||
|
||||
// temporarily remove failed state
|
||||
auto orig_state = state_;
|
||||
state_ = State::EXPLICIT_REJECT;
|
||||
write_frame_(data, data_size);
|
||||
write_frame_(data.get(), data_size);
|
||||
state_ = orig_state;
|
||||
}
|
||||
APIError APINoiseFrameHelper::read_packet(ReadPacketBuffer *buffer) {
|
||||
|
||||
@@ -21,12 +21,7 @@ static const char *const TAG = "api.plaintext";
|
||||
static constexpr size_t API_MAX_LOG_BYTES = 168;
|
||||
|
||||
#if ESPHOME_LOG_LEVEL >= ESPHOME_LOG_LEVEL_VERY_VERBOSE
|
||||
#define HELPER_LOG(msg, ...) \
|
||||
do { \
|
||||
char peername_buf[socket::SOCKADDR_STR_LEN]; \
|
||||
this->get_peername_to(peername_buf); \
|
||||
ESP_LOGVV(TAG, "%s (%s): " msg, this->client_name_, peername_buf, ##__VA_ARGS__); \
|
||||
} while (0)
|
||||
#define HELPER_LOG(msg, ...) ESP_LOGVV(TAG, "%s (%s): " msg, this->client_name_, this->client_peername_, ##__VA_ARGS__)
|
||||
#else
|
||||
#define HELPER_LOG(msg, ...) ((void) 0)
|
||||
#endif
|
||||
|
||||
@@ -27,6 +27,7 @@ extend google.protobuf.MessageOptions {
extend google.protobuf.FieldOptions {
optional string field_ifdef = 1042;
optional uint32 fixed_array_size = 50007;
optional bool no_zero_copy = 50008 [default=false];
optional bool fixed_array_skip_zero = 50009 [default=false];
optional string fixed_array_size_define = 50010;
optional string fixed_array_with_length_define = 50011;
@@ -79,15 +80,4 @@ extend google.protobuf.FieldOptions {
// Example: [(container_pointer_no_template) = "light::ColorModeMask"]
// generates: const light::ColorModeMask *supported_color_modes{};
optional string container_pointer_no_template = 50014;

// packed_buffer: Expose raw packed buffer instead of decoding into container
// When set on a packed repeated field, the generated code stores a pointer
// to the raw protobuf buffer instead of decoding values. This enables
// zero-copy passthrough when the consumer can decode on-demand.
// The field must be a packed repeated field (packed=true).
// Generates three fields:
// - const uint8_t *<field>_data_{nullptr};
// - uint16_t <field>_length_{0};
// - uint16_t <field>_count_{0};
optional bool packed_buffer = 50015 [default=false];
}
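A consumer of a packed_buffer field gets the raw bytes plus a length and count, and decodes on demand. The sketch below walks such a buffer using standard base-128 varint decoding; the function name and callback are illustrative, and zigzag decoding for sint32 fields is left to the caller.

```cpp
#include <cstddef>
#include <cstdint>

// Walk a protobuf packed-varint buffer without materializing a container.
template<typename Fn> void for_each_packed_varint(const uint8_t *data, size_t len, Fn &&fn) {
  size_t pos = 0;
  while (pos < len) {
    uint32_t value = 0;
    int shift = 0;
    while (pos < len) {
      uint8_t byte = data[pos++];
      value |= static_cast<uint32_t>(byte & 0x7F) << shift;
      if ((byte & 0x80) == 0)
        break;  // high bit clear: last byte of this varint
      shift += 7;
    }
    fn(value);  // caller applies zigzag decoding here if the field is sint32
  }
}
```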
@@ -3347,98 +3347,5 @@ void ZWaveProxyRequest::calculate_size(ProtoSize &size) const {
|
||||
size.add_length(1, this->data_len);
|
||||
}
|
||||
#endif
|
||||
#ifdef USE_INFRARED
|
||||
void ListEntitiesInfraredResponse::encode(ProtoWriteBuffer buffer) const {
|
||||
buffer.encode_string(1, this->object_id);
|
||||
buffer.encode_fixed32(2, this->key);
|
||||
buffer.encode_string(3, this->name);
|
||||
#ifdef USE_ENTITY_ICON
|
||||
buffer.encode_string(4, this->icon);
|
||||
#endif
|
||||
buffer.encode_bool(5, this->disabled_by_default);
|
||||
buffer.encode_uint32(6, static_cast<uint32_t>(this->entity_category));
|
||||
#ifdef USE_DEVICES
|
||||
buffer.encode_uint32(7, this->device_id);
|
||||
#endif
|
||||
buffer.encode_uint32(8, this->capabilities);
|
||||
}
|
||||
void ListEntitiesInfraredResponse::calculate_size(ProtoSize &size) const {
|
||||
size.add_length(1, this->object_id.size());
|
||||
size.add_fixed32(1, this->key);
|
||||
size.add_length(1, this->name.size());
|
||||
#ifdef USE_ENTITY_ICON
|
||||
size.add_length(1, this->icon.size());
|
||||
#endif
|
||||
size.add_bool(1, this->disabled_by_default);
|
||||
size.add_uint32(1, static_cast<uint32_t>(this->entity_category));
|
||||
#ifdef USE_DEVICES
|
||||
size.add_uint32(1, this->device_id);
|
||||
#endif
|
||||
size.add_uint32(1, this->capabilities);
|
||||
}
|
||||
#endif
|
||||
#ifdef USE_IR_RF
|
||||
bool InfraredRFTransmitRawTimingsRequest::decode_varint(uint32_t field_id, ProtoVarInt value) {
|
||||
switch (field_id) {
|
||||
#ifdef USE_DEVICES
|
||||
case 1:
|
||||
this->device_id = value.as_uint32();
|
||||
break;
|
||||
#endif
|
||||
case 3:
|
||||
this->carrier_frequency = value.as_uint32();
|
||||
break;
|
||||
case 4:
|
||||
this->repeat_count = value.as_uint32();
|
||||
break;
|
||||
default:
|
||||
return false;
|
||||
}
|
||||
return true;
|
||||
}
|
||||
bool InfraredRFTransmitRawTimingsRequest::decode_length(uint32_t field_id, ProtoLengthDelimited value) {
|
||||
switch (field_id) {
|
||||
case 5: {
|
||||
this->timings_data_ = value.data();
|
||||
this->timings_length_ = value.size();
|
||||
this->timings_count_ = count_packed_varints(value.data(), value.size());
|
||||
break;
|
||||
}
|
||||
default:
|
||||
return false;
|
||||
}
|
||||
return true;
|
||||
}
|
||||
bool InfraredRFTransmitRawTimingsRequest::decode_32bit(uint32_t field_id, Proto32Bit value) {
|
||||
switch (field_id) {
|
||||
case 2:
|
||||
this->key = value.as_fixed32();
|
||||
break;
|
||||
default:
|
||||
return false;
|
||||
}
|
||||
return true;
|
||||
}
|
||||
void InfraredRFReceiveEvent::encode(ProtoWriteBuffer buffer) const {
|
||||
#ifdef USE_DEVICES
|
||||
buffer.encode_uint32(1, this->device_id);
|
||||
#endif
|
||||
buffer.encode_fixed32(2, this->key);
|
||||
for (const auto &it : *this->timings) {
|
||||
buffer.encode_sint32(3, it, true);
|
||||
}
|
||||
}
|
||||
void InfraredRFReceiveEvent::calculate_size(ProtoSize &size) const {
|
||||
#ifdef USE_DEVICES
|
||||
size.add_uint32(1, this->device_id);
|
||||
#endif
|
||||
size.add_fixed32(1, this->key);
|
||||
if (!this->timings->empty()) {
|
||||
for (const auto &it : *this->timings) {
|
||||
size.add_sint32_force(1, it);
|
||||
}
|
||||
}
|
||||
}
|
||||
#endif
|
||||
|
||||
} // namespace esphome::api
|
||||
|
||||
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -8,100 +8,90 @@ namespace esphome::api {
static const char *const TAG = "api.service";

#ifdef HAS_PROTO_MESSAGE_DUMP
void APIServerConnectionBase::log_send_message_(const char *name, const char *dump) {
ESP_LOGVV(TAG, "send_message %s: %s", name, dump);
}
void APIServerConnectionBase::log_receive_message_(const LogString *name, const ProtoMessage &msg) {
DumpBuffer dump_buf;
ESP_LOGVV(TAG, "%s: %s", LOG_STR_ARG(name), msg.dump_to(dump_buf));
}
void APIServerConnectionBase::log_receive_message_(const LogString *name) {
ESP_LOGVV(TAG, "%s: {}", LOG_STR_ARG(name));
void APIServerConnectionBase::log_send_message_(const char *name, const std::string &dump) {
ESP_LOGVV(TAG, "send_message %s: %s", name, dump.c_str());
}
#endif

void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type, const uint8_t *msg_data) {
// Check authentication/connection requirements
switch (msg_type) {
case HelloRequest::MESSAGE_TYPE: // No setup required
case DisconnectRequest::MESSAGE_TYPE: // No setup required
case PingRequest::MESSAGE_TYPE: // No setup required
break;
case DeviceInfoRequest::MESSAGE_TYPE: // Connection setup only
if (!this->check_connection_setup_()) {
return;
}
break;
default:
if (!this->check_authenticated_()) {
return;
}
break;
}
switch (msg_type) {
|
||||
case HelloRequest::MESSAGE_TYPE: {
|
||||
HelloRequest msg;
|
||||
msg.decode(msg_data, msg_size);
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_hello_request"), msg);
|
||||
ESP_LOGVV(TAG, "on_hello_request: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_hello_request(msg);
|
||||
break;
|
||||
}
|
||||
case DisconnectRequest::MESSAGE_TYPE: {
|
||||
DisconnectRequest msg;
|
||||
// Empty message: no decode needed
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_disconnect_request"));
|
||||
ESP_LOGVV(TAG, "on_disconnect_request: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_disconnect_request();
|
||||
this->on_disconnect_request(msg);
|
||||
break;
|
||||
}
|
||||
case DisconnectResponse::MESSAGE_TYPE: {
|
||||
DisconnectResponse msg;
|
||||
// Empty message: no decode needed
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_disconnect_response"));
|
||||
ESP_LOGVV(TAG, "on_disconnect_response: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_disconnect_response();
|
||||
this->on_disconnect_response(msg);
|
||||
break;
|
||||
}
|
||||
case PingRequest::MESSAGE_TYPE: {
|
||||
PingRequest msg;
|
||||
// Empty message: no decode needed
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_ping_request"));
|
||||
ESP_LOGVV(TAG, "on_ping_request: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_ping_request();
|
||||
this->on_ping_request(msg);
|
||||
break;
|
||||
}
|
||||
case PingResponse::MESSAGE_TYPE: {
|
||||
PingResponse msg;
|
||||
// Empty message: no decode needed
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_ping_response"));
|
||||
ESP_LOGVV(TAG, "on_ping_response: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_ping_response();
|
||||
this->on_ping_response(msg);
|
||||
break;
|
||||
}
|
||||
case DeviceInfoRequest::MESSAGE_TYPE: {
|
||||
DeviceInfoRequest msg;
|
||||
// Empty message: no decode needed
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_device_info_request"));
|
||||
ESP_LOGVV(TAG, "on_device_info_request: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_device_info_request();
|
||||
this->on_device_info_request(msg);
|
||||
break;
|
||||
}
|
||||
case ListEntitiesRequest::MESSAGE_TYPE: {
|
||||
ListEntitiesRequest msg;
|
||||
// Empty message: no decode needed
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_list_entities_request"));
|
||||
ESP_LOGVV(TAG, "on_list_entities_request: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_list_entities_request();
|
||||
this->on_list_entities_request(msg);
|
||||
break;
|
||||
}
|
||||
case SubscribeStatesRequest::MESSAGE_TYPE: {
|
||||
SubscribeStatesRequest msg;
|
||||
// Empty message: no decode needed
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_subscribe_states_request"));
|
||||
ESP_LOGVV(TAG, "on_subscribe_states_request: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_subscribe_states_request();
|
||||
this->on_subscribe_states_request(msg);
|
||||
break;
|
||||
}
|
||||
case SubscribeLogsRequest::MESSAGE_TYPE: {
|
||||
SubscribeLogsRequest msg;
|
||||
msg.decode(msg_data, msg_size);
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
this->log_receive_message_(LOG_STR("on_subscribe_logs_request"), msg);
|
||||
ESP_LOGVV(TAG, "on_subscribe_logs_request: %s", msg.dump().c_str());
|
||||
#endif
|
||||
this->on_subscribe_logs_request(msg);
|
||||
break;
|
||||
@@ -111,7 +101,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
CoverCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_cover_command_request"), msg);
ESP_LOGVV(TAG, "on_cover_command_request: %s", msg.dump().c_str());
#endif
this->on_cover_command_request(msg);
break;
@@ -122,7 +112,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
FanCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_fan_command_request"), msg);
ESP_LOGVV(TAG, "on_fan_command_request: %s", msg.dump().c_str());
#endif
this->on_fan_command_request(msg);
break;
@@ -133,7 +123,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
LightCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_light_command_request"), msg);
ESP_LOGVV(TAG, "on_light_command_request: %s", msg.dump().c_str());
#endif
this->on_light_command_request(msg);
break;
@@ -144,7 +134,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
SwitchCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_switch_command_request"), msg);
ESP_LOGVV(TAG, "on_switch_command_request: %s", msg.dump().c_str());
#endif
this->on_switch_command_request(msg);
break;
@@ -152,10 +142,12 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
#endif
#ifdef USE_API_HOMEASSISTANT_SERVICES
case SubscribeHomeassistantServicesRequest::MESSAGE_TYPE: {
SubscribeHomeassistantServicesRequest msg;
// Empty message: no decode needed
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_subscribe_homeassistant_services_request"));
ESP_LOGVV(TAG, "on_subscribe_homeassistant_services_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_homeassistant_services_request();
this->on_subscribe_homeassistant_services_request(msg);
break;
}
#endif
@@ -163,17 +155,19 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
GetTimeResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_get_time_response"), msg);
ESP_LOGVV(TAG, "on_get_time_response: %s", msg.dump().c_str());
#endif
this->on_get_time_response(msg);
break;
}
#ifdef USE_API_HOMEASSISTANT_STATES
case SubscribeHomeAssistantStatesRequest::MESSAGE_TYPE: {
SubscribeHomeAssistantStatesRequest msg;
// Empty message: no decode needed
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_subscribe_home_assistant_states_request"));
ESP_LOGVV(TAG, "on_subscribe_home_assistant_states_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_home_assistant_states_request();
this->on_subscribe_home_assistant_states_request(msg);
break;
}
#endif
@@ -182,7 +176,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
HomeAssistantStateResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_home_assistant_state_response"), msg);
ESP_LOGVV(TAG, "on_home_assistant_state_response: %s", msg.dump().c_str());
#endif
this->on_home_assistant_state_response(msg);
break;
@@ -193,7 +187,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
ExecuteServiceRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_execute_service_request"), msg);
ESP_LOGVV(TAG, "on_execute_service_request: %s", msg.dump().c_str());
#endif
this->on_execute_service_request(msg);
break;
@@ -204,7 +198,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
CameraImageRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_camera_image_request"), msg);
ESP_LOGVV(TAG, "on_camera_image_request: %s", msg.dump().c_str());
#endif
this->on_camera_image_request(msg);
break;
@@ -215,7 +209,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
ClimateCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_climate_command_request"), msg);
ESP_LOGVV(TAG, "on_climate_command_request: %s", msg.dump().c_str());
#endif
this->on_climate_command_request(msg);
break;
@@ -226,7 +220,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
NumberCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_number_command_request"), msg);
ESP_LOGVV(TAG, "on_number_command_request: %s", msg.dump().c_str());
#endif
this->on_number_command_request(msg);
break;
@@ -237,7 +231,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
SelectCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_select_command_request"), msg);
ESP_LOGVV(TAG, "on_select_command_request: %s", msg.dump().c_str());
#endif
this->on_select_command_request(msg);
break;
@@ -248,7 +242,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
SirenCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_siren_command_request"), msg);
ESP_LOGVV(TAG, "on_siren_command_request: %s", msg.dump().c_str());
#endif
this->on_siren_command_request(msg);
break;
@@ -259,7 +253,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
LockCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_lock_command_request"), msg);
ESP_LOGVV(TAG, "on_lock_command_request: %s", msg.dump().c_str());
#endif
this->on_lock_command_request(msg);
break;
@@ -270,7 +264,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
ButtonCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_button_command_request"), msg);
ESP_LOGVV(TAG, "on_button_command_request: %s", msg.dump().c_str());
#endif
this->on_button_command_request(msg);
break;
@@ -281,7 +275,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
MediaPlayerCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_media_player_command_request"), msg);
ESP_LOGVV(TAG, "on_media_player_command_request: %s", msg.dump().c_str());
#endif
this->on_media_player_command_request(msg);
break;
@@ -292,7 +286,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
SubscribeBluetoothLEAdvertisementsRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_subscribe_bluetooth_le_advertisements_request"), msg);
ESP_LOGVV(TAG, "on_subscribe_bluetooth_le_advertisements_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_bluetooth_le_advertisements_request(msg);
break;
@@ -303,7 +297,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothDeviceRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_device_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_device_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_device_request(msg);
break;
@@ -314,7 +308,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothGATTGetServicesRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_gatt_get_services_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_gatt_get_services_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_gatt_get_services_request(msg);
break;
@@ -325,7 +319,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothGATTReadRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_gatt_read_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_gatt_read_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_gatt_read_request(msg);
break;
@@ -336,7 +330,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothGATTWriteRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_gatt_write_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_gatt_write_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_gatt_write_request(msg);
break;
@@ -347,7 +341,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothGATTReadDescriptorRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_gatt_read_descriptor_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_gatt_read_descriptor_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_gatt_read_descriptor_request(msg);
break;
@@ -358,7 +352,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothGATTWriteDescriptorRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_gatt_write_descriptor_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_gatt_write_descriptor_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_gatt_write_descriptor_request(msg);
break;
@@ -369,7 +363,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothGATTNotifyRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_gatt_notify_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_gatt_notify_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_gatt_notify_request(msg);
break;
@@ -377,19 +371,23 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
#endif
#ifdef USE_BLUETOOTH_PROXY
case SubscribeBluetoothConnectionsFreeRequest::MESSAGE_TYPE: {
SubscribeBluetoothConnectionsFreeRequest msg;
// Empty message: no decode needed
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_subscribe_bluetooth_connections_free_request"));
ESP_LOGVV(TAG, "on_subscribe_bluetooth_connections_free_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_bluetooth_connections_free_request();
this->on_subscribe_bluetooth_connections_free_request(msg);
break;
}
#endif
#ifdef USE_BLUETOOTH_PROXY
case UnsubscribeBluetoothLEAdvertisementsRequest::MESSAGE_TYPE: {
UnsubscribeBluetoothLEAdvertisementsRequest msg;
// Empty message: no decode needed
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_unsubscribe_bluetooth_le_advertisements_request"));
ESP_LOGVV(TAG, "on_unsubscribe_bluetooth_le_advertisements_request: %s", msg.dump().c_str());
#endif
this->on_unsubscribe_bluetooth_le_advertisements_request();
this->on_unsubscribe_bluetooth_le_advertisements_request(msg);
break;
}
#endif
@@ -398,7 +396,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
SubscribeVoiceAssistantRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_subscribe_voice_assistant_request"), msg);
ESP_LOGVV(TAG, "on_subscribe_voice_assistant_request: %s", msg.dump().c_str());
#endif
this->on_subscribe_voice_assistant_request(msg);
break;
@@ -409,7 +407,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
VoiceAssistantResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_voice_assistant_response"), msg);
ESP_LOGVV(TAG, "on_voice_assistant_response: %s", msg.dump().c_str());
#endif
this->on_voice_assistant_response(msg);
break;
@@ -420,7 +418,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
VoiceAssistantEventResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_voice_assistant_event_response"), msg);
ESP_LOGVV(TAG, "on_voice_assistant_event_response: %s", msg.dump().c_str());
#endif
this->on_voice_assistant_event_response(msg);
break;
@@ -431,7 +429,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
AlarmControlPanelCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_alarm_control_panel_command_request"), msg);
ESP_LOGVV(TAG, "on_alarm_control_panel_command_request: %s", msg.dump().c_str());
#endif
this->on_alarm_control_panel_command_request(msg);
break;
@@ -442,7 +440,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
TextCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_text_command_request"), msg);
ESP_LOGVV(TAG, "on_text_command_request: %s", msg.dump().c_str());
#endif
this->on_text_command_request(msg);
break;
@@ -453,7 +451,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
DateCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_date_command_request"), msg);
ESP_LOGVV(TAG, "on_date_command_request: %s", msg.dump().c_str());
#endif
this->on_date_command_request(msg);
break;
@@ -464,7 +462,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
TimeCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_time_command_request"), msg);
ESP_LOGVV(TAG, "on_time_command_request: %s", msg.dump().c_str());
#endif
this->on_time_command_request(msg);
break;
@@ -475,7 +473,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
VoiceAssistantAudio msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_voice_assistant_audio"), msg);
ESP_LOGVV(TAG, "on_voice_assistant_audio: %s", msg.dump().c_str());
#endif
this->on_voice_assistant_audio(msg);
break;
@@ -486,7 +484,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
ValveCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_valve_command_request"), msg);
ESP_LOGVV(TAG, "on_valve_command_request: %s", msg.dump().c_str());
#endif
this->on_valve_command_request(msg);
break;
@@ -497,7 +495,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
DateTimeCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_date_time_command_request"), msg);
ESP_LOGVV(TAG, "on_date_time_command_request: %s", msg.dump().c_str());
#endif
this->on_date_time_command_request(msg);
break;
@@ -508,7 +506,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
VoiceAssistantTimerEventResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_voice_assistant_timer_event_response"), msg);
ESP_LOGVV(TAG, "on_voice_assistant_timer_event_response: %s", msg.dump().c_str());
#endif
this->on_voice_assistant_timer_event_response(msg);
break;
@@ -519,7 +517,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
UpdateCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_update_command_request"), msg);
ESP_LOGVV(TAG, "on_update_command_request: %s", msg.dump().c_str());
#endif
this->on_update_command_request(msg);
break;
@@ -530,7 +528,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
VoiceAssistantAnnounceRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_voice_assistant_announce_request"), msg);
ESP_LOGVV(TAG, "on_voice_assistant_announce_request: %s", msg.dump().c_str());
#endif
this->on_voice_assistant_announce_request(msg);
break;
@@ -541,7 +539,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
VoiceAssistantConfigurationRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_voice_assistant_configuration_request"), msg);
ESP_LOGVV(TAG, "on_voice_assistant_configuration_request: %s", msg.dump().c_str());
#endif
this->on_voice_assistant_configuration_request(msg);
break;
@@ -552,7 +550,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
VoiceAssistantSetConfiguration msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_voice_assistant_set_configuration"), msg);
ESP_LOGVV(TAG, "on_voice_assistant_set_configuration: %s", msg.dump().c_str());
#endif
this->on_voice_assistant_set_configuration(msg);
break;
@@ -563,7 +561,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
NoiseEncryptionSetKeyRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_noise_encryption_set_key_request"), msg);
ESP_LOGVV(TAG, "on_noise_encryption_set_key_request: %s", msg.dump().c_str());
#endif
this->on_noise_encryption_set_key_request(msg);
break;
@@ -574,7 +572,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
BluetoothScannerSetModeRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_bluetooth_scanner_set_mode_request"), msg);
ESP_LOGVV(TAG, "on_bluetooth_scanner_set_mode_request: %s", msg.dump().c_str());
#endif
this->on_bluetooth_scanner_set_mode_request(msg);
break;
@@ -585,7 +583,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
ZWaveProxyFrame msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_z_wave_proxy_frame"), msg);
ESP_LOGVV(TAG, "on_z_wave_proxy_frame: %s", msg.dump().c_str());
#endif
this->on_z_wave_proxy_frame(msg);
break;
@@ -596,7 +594,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
ZWaveProxyRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_z_wave_proxy_request"), msg);
ESP_LOGVV(TAG, "on_z_wave_proxy_request: %s", msg.dump().c_str());
#endif
this->on_z_wave_proxy_request(msg);
break;
@@ -607,7 +605,7 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
HomeassistantActionResponse msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_homeassistant_action_response"), msg);
ESP_LOGVV(TAG, "on_homeassistant_action_response: %s", msg.dump().c_str());
#endif
this->on_homeassistant_action_response(msg);
break;
@@ -618,26 +616,232 @@ void APIServerConnectionBase::read_message(uint32_t msg_size, uint32_t msg_type,
WaterHeaterCommandRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_water_heater_command_request"), msg);
ESP_LOGVV(TAG, "on_water_heater_command_request: %s", msg.dump().c_str());
#endif
this->on_water_heater_command_request(msg);
break;
}
#endif
#ifdef USE_IR_RF
case InfraredRFTransmitRawTimingsRequest::MESSAGE_TYPE: {
InfraredRFTransmitRawTimingsRequest msg;
msg.decode(msg_data, msg_size);
#ifdef HAS_PROTO_MESSAGE_DUMP
this->log_receive_message_(LOG_STR("on_infrared_rf_transmit_raw_timings_request"), msg);
#endif
this->on_infrared_rf_transmit_raw_timings_request(msg);
break;
}
#endif
default:
break;
}
}

void APIServerConnection::on_hello_request(const HelloRequest &msg) {
if (!this->send_hello_response(msg)) {
this->on_fatal_error();
}
}
void APIServerConnection::on_disconnect_request(const DisconnectRequest &msg) {
if (!this->send_disconnect_response(msg)) {
this->on_fatal_error();
}
}
void APIServerConnection::on_ping_request(const PingRequest &msg) {
if (!this->send_ping_response(msg)) {
this->on_fatal_error();
}
}
void APIServerConnection::on_device_info_request(const DeviceInfoRequest &msg) {
if (!this->send_device_info_response(msg)) {
this->on_fatal_error();
}
}
void APIServerConnection::on_list_entities_request(const ListEntitiesRequest &msg) { this->list_entities(msg); }
void APIServerConnection::on_subscribe_states_request(const SubscribeStatesRequest &msg) {
this->subscribe_states(msg);
}
void APIServerConnection::on_subscribe_logs_request(const SubscribeLogsRequest &msg) { this->subscribe_logs(msg); }
#ifdef USE_API_HOMEASSISTANT_SERVICES
void APIServerConnection::on_subscribe_homeassistant_services_request(
const SubscribeHomeassistantServicesRequest &msg) {
this->subscribe_homeassistant_services(msg);
}
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
void APIServerConnection::on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &msg) {
this->subscribe_home_assistant_states(msg);
}
#endif
#ifdef USE_API_USER_DEFINED_ACTIONS
void APIServerConnection::on_execute_service_request(const ExecuteServiceRequest &msg) { this->execute_service(msg); }
#endif
#ifdef USE_API_NOISE
void APIServerConnection::on_noise_encryption_set_key_request(const NoiseEncryptionSetKeyRequest &msg) {
if (!this->send_noise_encryption_set_key_response(msg)) {
this->on_fatal_error();
}
}
#endif
#ifdef USE_BUTTON
void APIServerConnection::on_button_command_request(const ButtonCommandRequest &msg) { this->button_command(msg); }
#endif
#ifdef USE_CAMERA
void APIServerConnection::on_camera_image_request(const CameraImageRequest &msg) { this->camera_image(msg); }
#endif
#ifdef USE_CLIMATE
void APIServerConnection::on_climate_command_request(const ClimateCommandRequest &msg) { this->climate_command(msg); }
#endif
#ifdef USE_COVER
void APIServerConnection::on_cover_command_request(const CoverCommandRequest &msg) { this->cover_command(msg); }
#endif
#ifdef USE_DATETIME_DATE
void APIServerConnection::on_date_command_request(const DateCommandRequest &msg) { this->date_command(msg); }
#endif
#ifdef USE_DATETIME_DATETIME
void APIServerConnection::on_date_time_command_request(const DateTimeCommandRequest &msg) {
this->datetime_command(msg);
}
#endif
#ifdef USE_FAN
void APIServerConnection::on_fan_command_request(const FanCommandRequest &msg) { this->fan_command(msg); }
#endif
#ifdef USE_LIGHT
void APIServerConnection::on_light_command_request(const LightCommandRequest &msg) { this->light_command(msg); }
#endif
#ifdef USE_LOCK
void APIServerConnection::on_lock_command_request(const LockCommandRequest &msg) { this->lock_command(msg); }
#endif
#ifdef USE_MEDIA_PLAYER
void APIServerConnection::on_media_player_command_request(const MediaPlayerCommandRequest &msg) {
this->media_player_command(msg);
}
#endif
#ifdef USE_NUMBER
void APIServerConnection::on_number_command_request(const NumberCommandRequest &msg) { this->number_command(msg); }
#endif
#ifdef USE_SELECT
void APIServerConnection::on_select_command_request(const SelectCommandRequest &msg) { this->select_command(msg); }
#endif
#ifdef USE_SIREN
void APIServerConnection::on_siren_command_request(const SirenCommandRequest &msg) { this->siren_command(msg); }
#endif
#ifdef USE_SWITCH
void APIServerConnection::on_switch_command_request(const SwitchCommandRequest &msg) { this->switch_command(msg); }
#endif
#ifdef USE_TEXT
void APIServerConnection::on_text_command_request(const TextCommandRequest &msg) { this->text_command(msg); }
#endif
#ifdef USE_DATETIME_TIME
void APIServerConnection::on_time_command_request(const TimeCommandRequest &msg) { this->time_command(msg); }
#endif
#ifdef USE_UPDATE
void APIServerConnection::on_update_command_request(const UpdateCommandRequest &msg) { this->update_command(msg); }
#endif
#ifdef USE_VALVE
void APIServerConnection::on_valve_command_request(const ValveCommandRequest &msg) { this->valve_command(msg); }
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_subscribe_bluetooth_le_advertisements_request(
const SubscribeBluetoothLEAdvertisementsRequest &msg) {
this->subscribe_bluetooth_le_advertisements(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_device_request(const BluetoothDeviceRequest &msg) {
this->bluetooth_device_request(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_get_services_request(const BluetoothGATTGetServicesRequest &msg) {
this->bluetooth_gatt_get_services(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_read_request(const BluetoothGATTReadRequest &msg) {
this->bluetooth_gatt_read(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_write_request(const BluetoothGATTWriteRequest &msg) {
this->bluetooth_gatt_write(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_read_descriptor_request(const BluetoothGATTReadDescriptorRequest &msg) {
this->bluetooth_gatt_read_descriptor(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_write_descriptor_request(const BluetoothGATTWriteDescriptorRequest &msg) {
this->bluetooth_gatt_write_descriptor(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_gatt_notify_request(const BluetoothGATTNotifyRequest &msg) {
this->bluetooth_gatt_notify(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_subscribe_bluetooth_connections_free_request(
const SubscribeBluetoothConnectionsFreeRequest &msg) {
if (!this->send_subscribe_bluetooth_connections_free_response(msg)) {
this->on_fatal_error();
}
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_unsubscribe_bluetooth_le_advertisements_request(
const UnsubscribeBluetoothLEAdvertisementsRequest &msg) {
this->unsubscribe_bluetooth_le_advertisements(msg);
}
#endif
#ifdef USE_BLUETOOTH_PROXY
void APIServerConnection::on_bluetooth_scanner_set_mode_request(const BluetoothScannerSetModeRequest &msg) {
this->bluetooth_scanner_set_mode(msg);
}
#endif
#ifdef USE_VOICE_ASSISTANT
void APIServerConnection::on_subscribe_voice_assistant_request(const SubscribeVoiceAssistantRequest &msg) {
this->subscribe_voice_assistant(msg);
}
#endif
#ifdef USE_VOICE_ASSISTANT
void APIServerConnection::on_voice_assistant_configuration_request(const VoiceAssistantConfigurationRequest &msg) {
if (!this->send_voice_assistant_get_configuration_response(msg)) {
this->on_fatal_error();
}
}
#endif
#ifdef USE_VOICE_ASSISTANT
void APIServerConnection::on_voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) {
this->voice_assistant_set_configuration(msg);
}
#endif
#ifdef USE_ALARM_CONTROL_PANEL
void APIServerConnection::on_alarm_control_panel_command_request(const AlarmControlPanelCommandRequest &msg) {
this->alarm_control_panel_command(msg);
}
#endif
#ifdef USE_ZWAVE_PROXY
void APIServerConnection::on_z_wave_proxy_frame(const ZWaveProxyFrame &msg) { this->zwave_proxy_frame(msg); }
#endif
#ifdef USE_ZWAVE_PROXY
void APIServerConnection::on_z_wave_proxy_request(const ZWaveProxyRequest &msg) { this->zwave_proxy_request(msg); }
#endif

void APIServerConnection::read_message(uint32_t msg_size, uint32_t msg_type, const uint8_t *msg_data) {
// Check authentication/connection requirements for messages
switch (msg_type) {
case HelloRequest::MESSAGE_TYPE: // No setup required
case DisconnectRequest::MESSAGE_TYPE: // No setup required
case PingRequest::MESSAGE_TYPE: // No setup required
break; // Skip all checks for these messages
case DeviceInfoRequest::MESSAGE_TYPE: // Connection setup only
if (!this->check_connection_setup_()) {
return; // Connection not setup
}
break;
default:
// All other messages require authentication (which includes connection check)
if (!this->check_authenticated_()) {
return; // Authentication failed
}
break;
}

// Call base implementation to process the message
APIServerConnectionBase::read_message(msg_size, msg_type, msg_data);
}

} // namespace esphome::api

@@ -12,32 +12,29 @@ class APIServerConnectionBase : public ProtoService {
public:
#ifdef HAS_PROTO_MESSAGE_DUMP
protected:
void log_send_message_(const char *name, const char *dump);
void log_receive_message_(const LogString *name, const ProtoMessage &msg);
void log_receive_message_(const LogString *name);
void log_send_message_(const char *name, const std::string &dump);

public:
#endif

bool send_message(const ProtoMessage &msg, uint8_t message_type) {
#ifdef HAS_PROTO_MESSAGE_DUMP
DumpBuffer dump_buf;
this->log_send_message_(msg.message_name(), msg.dump_to(dump_buf));
this->log_send_message_(msg.message_name(), msg.dump());
#endif
return this->send_message_impl(msg, message_type);
return this->send_message_(msg, message_type);
}

virtual void on_hello_request(const HelloRequest &value){};

virtual void on_disconnect_request(){};
virtual void on_disconnect_response(){};
virtual void on_ping_request(){};
virtual void on_ping_response(){};
virtual void on_device_info_request(){};
virtual void on_disconnect_request(const DisconnectRequest &value){};
virtual void on_disconnect_response(const DisconnectResponse &value){};
virtual void on_ping_request(const PingRequest &value){};
virtual void on_ping_response(const PingResponse &value){};
virtual void on_device_info_request(const DeviceInfoRequest &value){};

virtual void on_list_entities_request(){};
virtual void on_list_entities_request(const ListEntitiesRequest &value){};

virtual void on_subscribe_states_request(){};
virtual void on_subscribe_states_request(const SubscribeStatesRequest &value){};

#ifdef USE_COVER
virtual void on_cover_command_request(const CoverCommandRequest &value){};
@@ -62,14 +59,14 @@ class APIServerConnectionBase : public ProtoService {
#endif

#ifdef USE_API_HOMEASSISTANT_SERVICES
virtual void on_subscribe_homeassistant_services_request(){};
virtual void on_subscribe_homeassistant_services_request(const SubscribeHomeassistantServicesRequest &value){};
#endif

#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
virtual void on_homeassistant_action_response(const HomeassistantActionResponse &value){};
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
virtual void on_subscribe_home_assistant_states_request(){};
virtual void on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &value){};
#endif

#ifdef USE_API_HOMEASSISTANT_STATES
@@ -148,11 +145,12 @@ class APIServerConnectionBase : public ProtoService {
#endif

#ifdef USE_BLUETOOTH_PROXY
virtual void on_subscribe_bluetooth_connections_free_request(){};
virtual void on_subscribe_bluetooth_connections_free_request(const SubscribeBluetoothConnectionsFreeRequest &value){};
#endif

#ifdef USE_BLUETOOTH_PROXY
virtual void on_unsubscribe_bluetooth_le_advertisements_request(){};
virtual void on_unsubscribe_bluetooth_le_advertisements_request(
const UnsubscribeBluetoothLEAdvertisementsRequest &value){};
#endif

#ifdef USE_BLUETOOTH_PROXY
@@ -219,13 +217,264 @@ class APIServerConnectionBase : public ProtoService {
#ifdef USE_ZWAVE_PROXY
virtual void on_z_wave_proxy_request(const ZWaveProxyRequest &value){};
#endif

#ifdef USE_IR_RF
virtual void on_infrared_rf_transmit_raw_timings_request(const InfraredRFTransmitRawTimingsRequest &value){};
#endif

protected:
void read_message(uint32_t msg_size, uint32_t msg_type, const uint8_t *msg_data) override;
};

class APIServerConnection : public APIServerConnectionBase {
public:
virtual bool send_hello_response(const HelloRequest &msg) = 0;
virtual bool send_disconnect_response(const DisconnectRequest &msg) = 0;
virtual bool send_ping_response(const PingRequest &msg) = 0;
virtual bool send_device_info_response(const DeviceInfoRequest &msg) = 0;
virtual void list_entities(const ListEntitiesRequest &msg) = 0;
virtual void subscribe_states(const SubscribeStatesRequest &msg) = 0;
virtual void subscribe_logs(const SubscribeLogsRequest &msg) = 0;
#ifdef USE_API_HOMEASSISTANT_SERVICES
virtual void subscribe_homeassistant_services(const SubscribeHomeassistantServicesRequest &msg) = 0;
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
virtual void subscribe_home_assistant_states(const SubscribeHomeAssistantStatesRequest &msg) = 0;
#endif
#ifdef USE_API_USER_DEFINED_ACTIONS
virtual void execute_service(const ExecuteServiceRequest &msg) = 0;
#endif
#ifdef USE_API_NOISE
virtual bool send_noise_encryption_set_key_response(const NoiseEncryptionSetKeyRequest &msg) = 0;
#endif
#ifdef USE_BUTTON
virtual void button_command(const ButtonCommandRequest &msg) = 0;
#endif
#ifdef USE_CAMERA
virtual void camera_image(const CameraImageRequest &msg) = 0;
#endif
#ifdef USE_CLIMATE
virtual void climate_command(const ClimateCommandRequest &msg) = 0;
#endif
#ifdef USE_COVER
virtual void cover_command(const CoverCommandRequest &msg) = 0;
#endif
#ifdef USE_DATETIME_DATE
virtual void date_command(const DateCommandRequest &msg) = 0;
#endif
#ifdef USE_DATETIME_DATETIME
virtual void datetime_command(const DateTimeCommandRequest &msg) = 0;
#endif
#ifdef USE_FAN
virtual void fan_command(const FanCommandRequest &msg) = 0;
#endif
#ifdef USE_LIGHT
virtual void light_command(const LightCommandRequest &msg) = 0;
#endif
#ifdef USE_LOCK
virtual void lock_command(const LockCommandRequest &msg) = 0;
#endif
#ifdef USE_MEDIA_PLAYER
virtual void media_player_command(const MediaPlayerCommandRequest &msg) = 0;
#endif
#ifdef USE_NUMBER
virtual void number_command(const NumberCommandRequest &msg) = 0;
#endif
#ifdef USE_SELECT
virtual void select_command(const SelectCommandRequest &msg) = 0;
#endif
#ifdef USE_SIREN
virtual void siren_command(const SirenCommandRequest &msg) = 0;
#endif
#ifdef USE_SWITCH
virtual void switch_command(const SwitchCommandRequest &msg) = 0;
#endif
#ifdef USE_TEXT
virtual void text_command(const TextCommandRequest &msg) = 0;
#endif
#ifdef USE_DATETIME_TIME
virtual void time_command(const TimeCommandRequest &msg) = 0;
#endif
#ifdef USE_UPDATE
virtual void update_command(const UpdateCommandRequest &msg) = 0;
#endif
#ifdef USE_VALVE
virtual void valve_command(const ValveCommandRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void subscribe_bluetooth_le_advertisements(const SubscribeBluetoothLEAdvertisementsRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_device_request(const BluetoothDeviceRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_gatt_get_services(const BluetoothGATTGetServicesRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_gatt_read(const BluetoothGATTReadRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_gatt_write(const BluetoothGATTWriteRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_gatt_read_descriptor(const BluetoothGATTReadDescriptorRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_gatt_write_descriptor(const BluetoothGATTWriteDescriptorRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_gatt_notify(const BluetoothGATTNotifyRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual bool send_subscribe_bluetooth_connections_free_response(
const SubscribeBluetoothConnectionsFreeRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void unsubscribe_bluetooth_le_advertisements(const UnsubscribeBluetoothLEAdvertisementsRequest &msg) = 0;
#endif
#ifdef USE_BLUETOOTH_PROXY
virtual void bluetooth_scanner_set_mode(const BluetoothScannerSetModeRequest &msg) = 0;
#endif
#ifdef USE_VOICE_ASSISTANT
virtual void subscribe_voice_assistant(const SubscribeVoiceAssistantRequest &msg) = 0;
#endif
#ifdef USE_VOICE_ASSISTANT
virtual bool send_voice_assistant_get_configuration_response(const VoiceAssistantConfigurationRequest &msg) = 0;
#endif
#ifdef USE_VOICE_ASSISTANT
virtual void voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) = 0;
#endif
#ifdef USE_ALARM_CONTROL_PANEL
virtual void alarm_control_panel_command(const AlarmControlPanelCommandRequest &msg) = 0;
#endif
#ifdef USE_ZWAVE_PROXY
virtual void zwave_proxy_frame(const ZWaveProxyFrame &msg) = 0;
#endif
#ifdef USE_ZWAVE_PROXY
virtual void zwave_proxy_request(const ZWaveProxyRequest &msg) = 0;
#endif
protected:
void on_hello_request(const HelloRequest &msg) override;
void on_disconnect_request(const DisconnectRequest &msg) override;
void on_ping_request(const PingRequest &msg) override;
void on_device_info_request(const DeviceInfoRequest &msg) override;
void on_list_entities_request(const ListEntitiesRequest &msg) override;
void on_subscribe_states_request(const SubscribeStatesRequest &msg) override;
void on_subscribe_logs_request(const SubscribeLogsRequest &msg) override;
#ifdef USE_API_HOMEASSISTANT_SERVICES
void on_subscribe_homeassistant_services_request(const SubscribeHomeassistantServicesRequest &msg) override;
#endif
#ifdef USE_API_HOMEASSISTANT_STATES
void on_subscribe_home_assistant_states_request(const SubscribeHomeAssistantStatesRequest &msg) override;
#endif
#ifdef USE_API_USER_DEFINED_ACTIONS
void on_execute_service_request(const ExecuteServiceRequest &msg) override;
#endif
#ifdef USE_API_NOISE
void on_noise_encryption_set_key_request(const NoiseEncryptionSetKeyRequest &msg) override;
#endif
#ifdef USE_BUTTON
void on_button_command_request(const ButtonCommandRequest &msg) override;
#endif
#ifdef USE_CAMERA
void on_camera_image_request(const CameraImageRequest &msg) override;
#endif
#ifdef USE_CLIMATE
void on_climate_command_request(const ClimateCommandRequest &msg) override;
#endif
#ifdef USE_COVER
void on_cover_command_request(const CoverCommandRequest &msg) override;
#endif
#ifdef USE_DATETIME_DATE
void on_date_command_request(const DateCommandRequest &msg) override;
#endif
#ifdef USE_DATETIME_DATETIME
void on_date_time_command_request(const DateTimeCommandRequest &msg) override;
#endif
#ifdef USE_FAN
void on_fan_command_request(const FanCommandRequest &msg) override;
#endif
#ifdef USE_LIGHT
void on_light_command_request(const LightCommandRequest &msg) override;
#endif
#ifdef USE_LOCK
void on_lock_command_request(const LockCommandRequest &msg) override;
#endif
#ifdef USE_MEDIA_PLAYER
void on_media_player_command_request(const MediaPlayerCommandRequest &msg) override;
#endif
#ifdef USE_NUMBER
void on_number_command_request(const NumberCommandRequest &msg) override;
#endif
#ifdef USE_SELECT
void on_select_command_request(const SelectCommandRequest &msg) override;
#endif
#ifdef USE_SIREN
void on_siren_command_request(const SirenCommandRequest &msg) override;
#endif
#ifdef USE_SWITCH
void on_switch_command_request(const SwitchCommandRequest &msg) override;
#endif
#ifdef USE_TEXT
void on_text_command_request(const TextCommandRequest &msg) override;
#endif
#ifdef USE_DATETIME_TIME
void on_time_command_request(const TimeCommandRequest &msg) override;
#endif
#ifdef USE_UPDATE
void on_update_command_request(const UpdateCommandRequest &msg) override;
#endif
#ifdef USE_VALVE
void on_valve_command_request(const ValveCommandRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_subscribe_bluetooth_le_advertisements_request(const SubscribeBluetoothLEAdvertisementsRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_device_request(const BluetoothDeviceRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_gatt_get_services_request(const BluetoothGATTGetServicesRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_gatt_read_request(const BluetoothGATTReadRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_gatt_write_request(const BluetoothGATTWriteRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_gatt_read_descriptor_request(const BluetoothGATTReadDescriptorRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_gatt_write_descriptor_request(const BluetoothGATTWriteDescriptorRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_gatt_notify_request(const BluetoothGATTNotifyRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_subscribe_bluetooth_connections_free_request(const SubscribeBluetoothConnectionsFreeRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_unsubscribe_bluetooth_le_advertisements_request(
const UnsubscribeBluetoothLEAdvertisementsRequest &msg) override;
#endif
#ifdef USE_BLUETOOTH_PROXY
void on_bluetooth_scanner_set_mode_request(const BluetoothScannerSetModeRequest &msg) override;
#endif
#ifdef USE_VOICE_ASSISTANT
void on_subscribe_voice_assistant_request(const SubscribeVoiceAssistantRequest &msg) override;
#endif
#ifdef USE_VOICE_ASSISTANT
void on_voice_assistant_configuration_request(const VoiceAssistantConfigurationRequest &msg) override;
#endif
#ifdef USE_VOICE_ASSISTANT
void on_voice_assistant_set_configuration(const VoiceAssistantSetConfiguration &msg) override;
#endif
#ifdef USE_ALARM_CONTROL_PANEL
void on_alarm_control_panel_command_request(const AlarmControlPanelCommandRequest &msg) override;
#endif
#ifdef USE_ZWAVE_PROXY
void on_z_wave_proxy_frame(const ZWaveProxyFrame &msg) override;
#endif
#ifdef USE_ZWAVE_PROXY
void on_z_wave_proxy_request(const ZWaveProxyRequest &msg) override;
#endif
void read_message(uint32_t msg_size, uint32_t msg_type, const uint8_t *msg_data) override;
};

} // namespace esphome::api

@@ -192,15 +192,11 @@ void APIServer::loop() {
ESP_LOGV(TAG, "Remove connection %s", client->get_name());

#ifdef USE_API_CLIENT_DISCONNECTED_TRIGGER
// Save client info before closing socket and removal for the trigger
char peername_buf[socket::SOCKADDR_STR_LEN];
// Save client info before removal for the trigger
std::string client_name(client->get_name());
std::string client_peername(client->get_peername_to(peername_buf));
std::string client_peername(client->get_peername());
#endif

// Close socket now (was deferred from on_fatal_error to allow getpeername)
client->helper_->close();

// Swap with the last element and pop (avoids expensive vector shifts)
if (client_index < this->clients_.size() - 1) {
std::swap(this->clients_[client_index], this->clients_.back());
@@ -215,7 +211,7 @@ void APIServer::loop() {

#ifdef USE_API_CLIENT_DISCONNECTED_TRIGGER
// Fire trigger after client is removed so api.connected reflects the true state
this->client_disconnected_trigger_.trigger(client_name, client_peername);
this->client_disconnected_trigger_->trigger(client_name, client_peername);
#endif
// Don't increment client_index since we need to process the swapped element
}
@@ -245,10 +241,8 @@ void APIServer::handle_disconnect(APIConnection *conn) {}
void APIServer::on_##entity_name##_update(entity_type *obj) { /* NOLINT(bugprone-macro-parentheses) */ \
if (obj->is_internal()) \
return; \
for (auto &c : this->clients_) { \
if (c->flags_.state_subscription) \
c->send_##entity_name##_state(obj); \
} \
for (auto &c : this->clients_) \
c->send_##entity_name##_state(obj); \
}

#ifdef USE_BINARY_SENSOR
@@ -324,13 +318,13 @@ API_DISPATCH_UPDATE(water_heater::WaterHeater, water_heater)
#endif

#ifdef USE_EVENT
// Event is a special case - unlike other entities with simple state fields,
// events store their state in a member accessed via obj->get_last_event_type()
void APIServer::on_event(event::Event *obj) {
if (obj->is_internal())
return;
for (auto &c : this->clients_) {
if (c->flags_.state_subscription)
c->send_event(obj);
}
for (auto &c : this->clients_)
c->send_event(obj, obj->get_last_event_type());
}
#endif

@@ -339,10 +333,8 @@ void APIServer::on_event(event::Event *obj) {
void APIServer::on_update(update::UpdateEntity *obj) {
if (obj->is_internal())
return;
for (auto &c : this->clients_) {
if (c->flags_.state_subscription)
c->send_update_state(obj);
}
for (auto &c : this->clients_)
c->send_update_state(obj);
}
#endif

@@ -355,21 +347,6 @@ void APIServer::on_zwave_proxy_request(const esphome::api::ProtoMessage &msg) {
}
#endif

#ifdef USE_IR_RF
void APIServer::send_infrared_rf_receive_event([[maybe_unused]] uint32_t device_id, uint32_t key,
const std::vector<int32_t> *timings) {
InfraredRFReceiveEvent resp{};
#ifdef USE_DEVICES
resp.device_id = device_id;
#endif
resp.key = key;
resp.timings = timings;

for (auto &c : this->clients_)
c->send_infrared_rf_receive_event(resp);
}
#endif

#ifdef USE_ALARM_CONTROL_PANEL
API_DISPATCH_UPDATE(alarm_control_panel::AlarmControlPanel, alarm_control_panel)
#endif
@@ -562,10 +539,8 @@ bool APIServer::clear_noise_psk(bool make_active) {
#ifdef USE_HOMEASSISTANT_TIME
void APIServer::request_time() {
for (auto &client : this->clients_) {
if (!client->flags_.remove && client->is_authenticated()) {
if (!client->flags_.remove && client->is_authenticated())
client->send_time_request();
return; // Only request from one client to avoid clock conflicts
}
}
}
#endif
@@ -625,7 +600,8 @@ void APIServer::on_shutdown() {
if (!c->send_message(req, DisconnectRequest::MESSAGE_TYPE)) {
// If we can't send the disconnect request directly (tx_buffer full),
// schedule it at the front of the batch so it will be sent with priority
c->schedule_message_front_(nullptr, DisconnectRequest::MESSAGE_TYPE, DisconnectRequest::ESTIMATED_SIZE);
c->schedule_message_front_(nullptr, &APIConnection::try_send_disconnect_request, DisconnectRequest::MESSAGE_TYPE,
DisconnectRequest::ESTIMATED_SIZE);
}
}
}
@@ -657,18 +633,18 @@ uint32_t APIServer::register_active_action_call(uint32_t client_call_id, APIConn
this->active_action_calls_.push_back({action_call_id, client_call_id, conn});

// Schedule automatic cleanup after timeout (client will have given up by then)
// Uses numeric ID overload to avoid heap allocation from str_sprintf
this->set_timeout(action_call_id, USE_API_ACTION_CALL_TIMEOUT_MS, [this, action_call_id]() {
ESP_LOGD(TAG, "Action call %u timed out", action_call_id);
this->unregister_active_action_call(action_call_id);
});
this->set_timeout(str_sprintf("action_call_%u", action_call_id), USE_API_ACTION_CALL_TIMEOUT_MS,
[this, action_call_id]() {
ESP_LOGD(TAG, "Action call %u timed out", action_call_id);
this->unregister_active_action_call(action_call_id);
});

return action_call_id;
}

void APIServer::unregister_active_action_call(uint32_t action_call_id) {
// Cancel the timeout for this action call (uses numeric ID overload)
this->cancel_timeout(action_call_id);
// Cancel the timeout for this action call
this->cancel_timeout(str_sprintf("action_call_%u", action_call_id));

// Swap-and-pop is more efficient than remove_if for unordered vectors
for (size_t i = 0; i < this->active_action_calls_.size(); i++) {
@@ -684,8 +660,8 @@ void APIServer::unregister_active_action_calls_for_connection(APIConnection *con
// Remove all active action calls for disconnected connection using swap-and-pop
for (size_t i = 0; i < this->active_action_calls_.size();) {
if (this->active_action_calls_[i].connection == conn) {
// Cancel the timeout for this action call (uses numeric ID overload)
this->cancel_timeout(this->active_action_calls_[i].action_call_id);
// Cancel the timeout for this action call
this->cancel_timeout(str_sprintf("action_call_%u", this->active_action_calls_[i].action_call_id));

std::swap(this->active_action_calls_[i], this->active_action_calls_.back());
this->active_action_calls_.pop_back();

@@ -185,9 +185,6 @@ class APIServer : public Component,
#ifdef USE_ZWAVE_PROXY
void on_zwave_proxy_request(const esphome::api::ProtoMessage &msg);
#endif
#ifdef USE_IR_RF
void send_infrared_rf_receive_event(uint32_t device_id, uint32_t key, const std::vector<int32_t> *timings);
#endif

bool is_connected(bool state_subscription_only = false) const;

@@ -227,10 +224,12 @@ class APIServer : public Component,
#endif

#ifdef USE_API_CLIENT_CONNECTED_TRIGGER
Trigger<std::string, std::string> *get_client_connected_trigger() { return &this->client_connected_trigger_; }
Trigger<std::string, std::string> *get_client_connected_trigger() const { return this->client_connected_trigger_; }
#endif
#ifdef USE_API_CLIENT_DISCONNECTED_TRIGGER
Trigger<std::string, std::string> *get_client_disconnected_trigger() { return &this->client_disconnected_trigger_; }
Trigger<std::string, std::string> *get_client_disconnected_trigger() const {
return this->client_disconnected_trigger_;
}
#endif

protected:
@@ -251,10 +250,10 @@ class APIServer : public Component,
// Pointers and pointer-like types first (4 bytes each)
std::unique_ptr<socket::Socket> socket_ = nullptr;
#ifdef USE_API_CLIENT_CONNECTED_TRIGGER
Trigger<std::string, std::string> client_connected_trigger_;
Trigger<std::string, std::string> *client_connected_trigger_ = new Trigger<std::string, std::string>();
#endif
#ifdef USE_API_CLIENT_DISCONNECTED_TRIGGER
Trigger<std::string, std::string> client_disconnected_trigger_;
Trigger<std::string, std::string> *client_disconnected_trigger_ = new Trigger<std::string, std::string>();
#endif

// 4-byte aligned types

@@ -265,7 +265,7 @@ class CustomAPIDevice {
|
||||
for (auto &it : data) {
|
||||
auto &kv = resp.data.emplace_back();
|
||||
kv.key = StringRef(it.first);
|
||||
kv.value = StringRef(it.second); // data map lives until send completes
|
||||
kv.value = it.second; // value is std::string (no_zero_copy), assign directly
|
||||
}
|
||||
global_api_server->send_homeassistant_action(resp);
|
||||
}
|
||||
@@ -308,7 +308,7 @@ class CustomAPIDevice {
|
||||
for (auto &it : data) {
|
||||
auto &kv = resp.data.emplace_back();
|
||||
kv.key = StringRef(it.first);
|
||||
kv.value = StringRef(it.second); // data map lives until send completes
|
||||
kv.value = it.second; // value is std::string (no_zero_copy), assign directly
|
||||
}
|
||||
global_api_server->send_homeassistant_action(resp);
|
||||
}
|
||||
|
||||
@@ -25,9 +25,7 @@ template<typename... X> class TemplatableStringValue : public TemplatableValue<s
|
||||
|
||||
private:
|
||||
// Helper to convert value to string - handles the case where value is already a string
|
||||
template<typename T> static std::string value_to_string(T &&val) {
|
||||
return to_string(std::forward<T>(val)); // NOLINT
|
||||
}
|
||||
template<typename T> static std::string value_to_string(T &&val) { return to_string(std::forward<T>(val)); }
|
||||
|
||||
// Overloads for string types - needed because std::to_string doesn't support them
|
||||
static std::string value_to_string(char *val) {
|
||||
@@ -138,10 +136,12 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
||||
void set_wants_response() { this->flags_.wants_response = true; }
|
||||
|
||||
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
|
||||
Trigger<JsonObjectConst, Ts...> *get_success_trigger_with_response() { return &this->success_trigger_with_response_; }
|
||||
Trigger<JsonObjectConst, Ts...> *get_success_trigger_with_response() const {
|
||||
return this->success_trigger_with_response_;
|
||||
}
|
||||
#endif
|
||||
Trigger<Ts...> *get_success_trigger() { return &this->success_trigger_; }
|
||||
Trigger<std::string, Ts...> *get_error_trigger() { return &this->error_trigger_; }
|
||||
Trigger<Ts...> *get_success_trigger() const { return this->success_trigger_; }
|
||||
Trigger<std::string, Ts...> *get_error_trigger() const { return this->error_trigger_; }
|
||||
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
|
||||
|
||||
void play(const Ts &...x) override {
|
||||
@@ -149,21 +149,11 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
||||
std::string service_value = this->service_.value(x...);
|
||||
resp.service = StringRef(service_value);
|
||||
resp.is_event = this->flags_.is_event;
|
||||
|
||||
// Local storage for lambda-evaluated strings - lives until after send
|
||||
FixedVector<std::string> data_storage;
|
||||
FixedVector<std::string> data_template_storage;
|
||||
FixedVector<std::string> variables_storage;
|
||||
|
||||
this->populate_service_map(resp.data, this->data_, data_storage, x...);
|
||||
this->populate_service_map(resp.data_template, this->data_template_, data_template_storage, x...);
|
||||
this->populate_service_map(resp.variables, this->variables_, variables_storage, x...);
|
||||
this->populate_service_map(resp.data, this->data_, x...);
|
||||
this->populate_service_map(resp.data_template, this->data_template_, x...);
|
||||
this->populate_service_map(resp.variables, this->variables_, x...);
|
||||
|
||||
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
|
||||
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
|
||||
// IMPORTANT: Declare at outer scope so it lives until send_homeassistant_action returns.
|
||||
std::string response_template_value;
|
||||
#endif
|
||||
if (this->flags_.wants_status) {
|
||||
// Generate a unique call ID for this service call
|
||||
static uint32_t call_id_counter = 1;
|
||||
@@ -174,8 +164,8 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
||||
resp.wants_response = true;
|
||||
// Set response template if provided
|
||||
if (this->flags_.has_response_template) {
|
||||
response_template_value = this->response_template_.value(x...);
|
||||
resp.response_template = StringRef(response_template_value);
|
||||
std::string response_template_value = this->response_template_.value(x...);
|
||||
resp.response_template = response_template_value;
|
||||
}
|
||||
}
|
||||
#endif
|
||||
@@ -187,14 +177,14 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
||||
if (response.is_success()) {
|
||||
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
|
||||
if (this->flags_.wants_response) {
|
||||
this->success_trigger_with_response_.trigger(response.get_json(), args...);
|
||||
this->success_trigger_with_response_->trigger(response.get_json(), args...);
|
||||
} else
|
||||
#endif
|
||||
{
|
||||
this->success_trigger_.trigger(args...);
|
||||
this->success_trigger_->trigger(args...);
|
||||
}
|
||||
} else {
|
||||
this->error_trigger_.trigger(response.get_error_message(), args...);
|
||||
this->error_trigger_->trigger(response.get_error_message(), args...);
|
||||
}
|
||||
},
|
||||
captured_args);
|
||||
@@ -215,31 +205,12 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
||||
}
|
||||
|
||||
template<typename VectorType, typename SourceType>
|
||||
static void populate_service_map(VectorType &dest, SourceType &source, FixedVector<std::string> &value_storage,
|
||||
Ts... x) {
|
||||
static void populate_service_map(VectorType &dest, SourceType &source, Ts... x) {
|
||||
dest.init(source.size());
|
||||
|
||||
// Count non-static strings to allocate exact storage needed
|
||||
size_t lambda_count = 0;
|
||||
for (const auto &it : source) {
|
||||
if (!it.value.is_static_string()) {
|
||||
lambda_count++;
|
||||
}
|
||||
}
|
||||
value_storage.init(lambda_count);
|
||||
|
||||
for (auto &it : source) {
|
||||
auto &kv = dest.emplace_back();
|
||||
kv.key = StringRef(it.key);
|
||||
|
||||
if (it.value.is_static_string()) {
|
||||
// Static string from YAML - zero allocation
|
||||
kv.value = StringRef(it.value.get_static_string());
|
||||
} else {
|
||||
// Lambda evaluation - store result, reference it
|
||||
value_storage.push_back(it.value.value(x...));
|
||||
kv.value = StringRef(value_storage.back());
|
||||
}
|
||||
kv.value = it.value.value(x...);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -251,10 +222,10 @@ template<typename... Ts> class HomeAssistantServiceCallAction : public Action<Ts
|
||||
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES
|
||||
#ifdef USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
|
||||
TemplatableStringValue<Ts...> response_template_{""};
|
||||
Trigger<JsonObjectConst, Ts...> success_trigger_with_response_;
|
||||
Trigger<JsonObjectConst, Ts...> *success_trigger_with_response_ = new Trigger<JsonObjectConst, Ts...>();
|
||||
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES_JSON
|
||||
Trigger<Ts...> success_trigger_;
|
||||
Trigger<std::string, Ts...> error_trigger_;
|
||||
Trigger<Ts...> *success_trigger_ = new Trigger<Ts...>();
|
||||
Trigger<std::string, Ts...> *error_trigger_ = new Trigger<std::string, Ts...>();
|
||||
#endif // USE_API_HOMEASSISTANT_ACTION_RESPONSES
|
||||
|
||||
struct Flags {
|
||||
|
||||
@@ -76,9 +76,6 @@ LIST_ENTITIES_HANDLER(alarm_control_panel, alarm_control_panel::AlarmControlPane
#ifdef USE_WATER_HEATER
LIST_ENTITIES_HANDLER(water_heater, water_heater::WaterHeater, ListEntitiesWaterHeaterResponse)
#endif
#ifdef USE_INFRARED
LIST_ENTITIES_HANDLER(infrared, infrared::Infrared, ListEntitiesInfraredResponse)
#endif
#ifdef USE_EVENT
LIST_ENTITIES_HANDLER(event, event::Event, ListEntitiesEventResponse)
#endif

@@ -9,10 +9,11 @@ namespace esphome::api {
class APIConnection;

// Macro for generating ListEntitiesIterator handlers
// Calls schedule_message_ which dispatches to try_send_*_info
// Calls schedule_message_ with try_send_*_info
#define LIST_ENTITIES_HANDLER(entity_type, EntityClass, ResponseType) \
  bool ListEntitiesIterator::on_##entity_type(EntityClass *entity) { /* NOLINT(bugprone-macro-parentheses) */ \
    return this->client_->schedule_message_(entity, ResponseType::MESSAGE_TYPE, ResponseType::ESTIMATED_SIZE); \
    return this->client_->schedule_message_(entity, &APIConnection::try_send_##entity_type##_info, \
                                            ResponseType::MESSAGE_TYPE, ResponseType::ESTIMATED_SIZE); \
  }

class ListEntitiesIterator : public ComponentIterator {
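
For reference, the variant of the macro that passes the member-function pointer expands roughly as follows for a single entity type (`sensor` and `ListEntitiesSensorResponse` are used purely as an illustration; the real instantiations are the `LIST_ENTITIES_HANDLER(...)` lines in the corresponding .cpp file):

```cpp
// LIST_ENTITIES_HANDLER(sensor, sensor::Sensor, ListEntitiesSensorResponse) expands to:
bool ListEntitiesIterator::on_sensor(sensor::Sensor *entity) { /* NOLINT(bugprone-macro-parentheses) */
  return this->client_->schedule_message_(entity, &APIConnection::try_send_sensor_info,
                                          ListEntitiesSensorResponse::MESSAGE_TYPE,
                                          ListEntitiesSensorResponse::ESTIMATED_SIZE);
}
```
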
@@ -84,9 +85,6 @@ class ListEntitiesIterator : public ComponentIterator {
#ifdef USE_WATER_HEATER
  bool on_water_heater(water_heater::WaterHeater *entity) override;
#endif
#ifdef USE_INFRARED
  bool on_infrared(infrared::Infrared *entity) override;
#endif
#ifdef USE_EVENT
  bool on_event(event::Event *entity) override;
#endif

@@ -48,14 +48,14 @@ uint32_t ProtoDecodableMessage::count_repeated_field(const uint8_t *buffer, size
      }
      uint32_t field_length = res->as_uint32();
      ptr += consumed;
      if (field_length > static_cast<size_t>(end - ptr)) {
      if (ptr + field_length > end) {
        return count;  // Out of bounds
      }
      ptr += field_length;
      break;
    }
    case WIRE_TYPE_FIXED32: {  // 32-bit - skip 4 bytes
      if (end - ptr < 4) {
      if (ptr + 4 > end) {
        return count;
      }
      ptr += 4;
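
The two bounds checks paired above are not interchangeable in standards terms: `ptr + field_length > end` first computes a pointer that may land far beyond the end of the buffer (which is undefined behavior and can wrap on 32-bit targets), whereas comparing `field_length` against the remaining byte count never forms an out-of-range pointer. A minimal sketch of the subtraction-based form, with hypothetical names:

```cpp
#include <cstddef>
#include <cstdint>

// True if `need` more bytes starting at `ptr` fit inside the range [ptr, end).
// The remaining space is computed first, so no out-of-bounds pointer is ever formed.
inline bool fits_in_buffer(const uint8_t *ptr, const uint8_t *end, uint32_t need) {
  return need <= static_cast<size_t>(end - ptr);
}
```
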
@@ -110,7 +110,7 @@ void ProtoDecodableMessage::decode(const uint8_t *buffer, size_t length) {
      }
      uint32_t field_length = res->as_uint32();
      ptr += consumed;
      if (field_length > static_cast<size_t>(end - ptr)) {
      if (ptr + field_length > end) {
        ESP_LOGV(TAG, "Out-of-bounds Length Delimited at offset %ld", (long) (ptr - buffer));
        return;
      }
@@ -121,7 +121,7 @@
      break;
    }
    case WIRE_TYPE_FIXED32: {  // 32-bit
      if (end - ptr < 4) {
      if (ptr + 4 > end) {
        ESP_LOGV(TAG, "Out-of-bounds Fixed32-bit at offset %ld", (long) (ptr - buffer));
        return;
      }
@@ -139,4 +139,12 @@
  }
}

#ifdef HAS_PROTO_MESSAGE_DUMP
std::string ProtoMessage::dump() const {
  std::string out;
  this->dump_to(out);
  return out;
}
#endif

}  // namespace esphome::api

@@ -112,12 +112,8 @@ class ProtoVarInt {
    uint64_t result = buffer[0] & 0x7F;
    uint8_t bitpos = 7;

    // A 64-bit varint is at most 10 bytes (ceil(64/7)). Reject overlong encodings
    // to avoid undefined behavior from shifting uint64_t by >= 64 bits.
    uint32_t max_len = std::min(len, uint32_t(10));

    // Start from the second byte since we've already processed the first
    for (uint32_t i = 1; i < max_len; i++) {
    for (uint32_t i = 1; i < len; i++) {
      uint8_t val = buffer[i];
      result |= uint64_t(val & 0x7F) << uint64_t(bitpos);
      bitpos += 7;
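
The guard on the removed side of this hunk exists because a 64-bit varint needs at most 10 base-128 bytes; once a decoder runs past that, the shift amount reaches 64 and the behavior is undefined. A standalone sketch of a decoder with that cap (hypothetical free function, not the `ProtoVarInt` member shown above):

```cpp
#include <cstddef>
#include <cstdint>
#include <optional>

// Decode a base-128 varint, rejecting truncated input and encodings longer
// than 10 bytes so the shift amount never reaches 64 bits.
inline std::optional<uint64_t> decode_varint64(const uint8_t *buffer, size_t len) {
  uint64_t result = 0;
  uint32_t bitpos = 0;
  const size_t max_len = len < 10 ? len : 10;
  for (size_t i = 0; i < max_len; i++) {
    result |= uint64_t(buffer[i] & 0x7F) << bitpos;
    if ((buffer[i] & 0x80) == 0)
      return result;  // continuation bit clear: varint is complete
    bitpos += 7;
  }
  return std::nullopt;  // truncated or overlong encoding
}
```
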
@@ -366,77 +362,6 @@ class ProtoWriteBuffer {
|
||||
std::vector<uint8_t> *buffer_;
|
||||
};
|
||||
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
/**
|
||||
* Fixed-size buffer for message dumps - avoids heap allocation.
|
||||
* Sized to match the logger's default tx_buffer_size (512 bytes)
|
||||
* since anything larger gets truncated anyway.
|
||||
*/
|
||||
class DumpBuffer {
|
||||
public:
|
||||
// Matches default tx_buffer_size in logger component
|
||||
static constexpr size_t CAPACITY = 512;
|
||||
|
||||
DumpBuffer() : pos_(0) { buf_[0] = '\0'; }
|
||||
|
||||
DumpBuffer &append(const char *str) {
|
||||
if (str) {
|
||||
append_impl_(str, strlen(str));
|
||||
}
|
||||
return *this;
|
||||
}
|
||||
|
||||
DumpBuffer &append(const char *str, size_t len) {
|
||||
append_impl_(str, len);
|
||||
return *this;
|
||||
}
|
||||
|
||||
DumpBuffer &append(size_t n, char c) {
|
||||
size_t space = CAPACITY - 1 - pos_;
|
||||
if (n > space)
|
||||
n = space;
|
||||
if (n > 0) {
|
||||
memset(buf_ + pos_, c, n);
|
||||
pos_ += n;
|
||||
buf_[pos_] = '\0';
|
||||
}
|
||||
return *this;
|
||||
}
|
||||
|
||||
const char *c_str() const { return buf_; }
|
||||
size_t size() const { return pos_; }
|
||||
|
||||
/// Get writable buffer pointer for use with buf_append_printf
|
||||
char *data() { return buf_; }
|
||||
/// Get current position for use with buf_append_printf
|
||||
size_t pos() const { return pos_; }
|
||||
/// Update position after buf_append_printf call
|
||||
void set_pos(size_t pos) {
|
||||
if (pos >= CAPACITY) {
|
||||
pos_ = CAPACITY - 1;
|
||||
} else {
|
||||
pos_ = pos;
|
||||
}
|
||||
buf_[pos_] = '\0';
|
||||
}
|
||||
|
||||
private:
|
||||
void append_impl_(const char *str, size_t len) {
|
||||
size_t space = CAPACITY - 1 - pos_;
|
||||
if (len > space)
|
||||
len = space;
|
||||
if (len > 0) {
|
||||
memcpy(buf_ + pos_, str, len);
|
||||
pos_ += len;
|
||||
buf_[pos_] = '\0';
|
||||
}
|
||||
}
|
||||
|
||||
char buf_[CAPACITY];
|
||||
size_t pos_;
|
||||
};
|
||||
#endif
|
||||
|
||||
class ProtoMessage {
|
||||
public:
|
||||
virtual ~ProtoMessage() = default;
|
||||
@@ -445,7 +370,8 @@ class ProtoMessage {
|
||||
// Default implementation for messages with no fields
|
||||
virtual void calculate_size(ProtoSize &size) const {}
|
||||
#ifdef HAS_PROTO_MESSAGE_DUMP
|
||||
virtual const char *dump_to(DumpBuffer &out) const = 0;
|
||||
std::string dump() const;
|
||||
virtual void dump_to(std::string &out) const = 0;
|
||||
virtual const char *message_name() const { return "unknown"; }
|
||||
#endif
|
||||
};
|
||||
@@ -961,16 +887,32 @@ class ProtoService {
|
||||
virtual bool is_connection_setup() = 0;
|
||||
virtual void on_fatal_error() = 0;
|
||||
virtual void on_no_setup_connection() = 0;
|
||||
/**
|
||||
* Create a buffer with a reserved size.
|
||||
* @param reserve_size The number of bytes to pre-allocate in the buffer. This is a hint
|
||||
* to optimize memory usage and avoid reallocations during encoding.
|
||||
* Implementations should aim to allocate at least this size.
|
||||
* @return A ProtoWriteBuffer object with the reserved size.
|
||||
*/
|
||||
virtual ProtoWriteBuffer create_buffer(uint32_t reserve_size) = 0;
|
||||
virtual bool send_buffer(ProtoWriteBuffer buffer, uint8_t message_type) = 0;
|
||||
virtual void read_message(uint32_t msg_size, uint32_t msg_type, const uint8_t *msg_data) = 0;
|
||||
/**
|
||||
* Send a protobuf message by calculating its size, allocating a buffer, encoding, and sending.
|
||||
* This is the implementation method - callers should use send_message() which adds logging.
|
||||
* @param msg The protobuf message to send.
|
||||
* @param message_type The message type identifier.
|
||||
* @return True if the message was sent successfully, false otherwise.
|
||||
*/
|
||||
virtual bool send_message_impl(const ProtoMessage &msg, uint8_t message_type) = 0;
|
||||
|
||||
// Optimized method that pre-allocates buffer based on message size
|
||||
bool send_message_(const ProtoMessage &msg, uint8_t message_type) {
|
||||
ProtoSize size;
|
||||
msg.calculate_size(size);
|
||||
uint32_t msg_size = size.get_size();
|
||||
|
||||
// Create a pre-sized buffer
|
||||
auto buffer = this->create_buffer(msg_size);
|
||||
|
||||
// Encode message into the buffer
|
||||
msg.encode(buffer);
|
||||
|
||||
// Send the buffer
|
||||
return this->send_buffer(buffer, message_type);
|
||||
}
|
||||
|
||||
// Authentication helper methods
|
||||
inline bool check_connection_setup_() {
|
||||
|
||||
@@ -79,9 +79,6 @@ class InitialStateIterator : public ComponentIterator {
|
||||
#ifdef USE_WATER_HEATER
|
||||
bool on_water_heater(water_heater::WaterHeater *entity) override;
|
||||
#endif
|
||||
#ifdef USE_INFRARED
|
||||
bool on_infrared(infrared::Infrared *infrared) override { return true; };
|
||||
#endif
|
||||
#ifdef USE_EVENT
|
||||
bool on_event(event::Event *event) override { return true; };
|
||||
#endif
|
||||
|
||||
@@ -264,9 +264,9 @@ template<typename... Ts> class APIRespondAction : public Action<Ts...> {
|
||||
// Build and send JSON response
|
||||
json::JsonBuilder builder;
|
||||
this->json_builder_(x..., builder.root());
|
||||
auto json_buf = builder.serialize();
|
||||
std::string json_str = builder.serialize();
|
||||
this->parent_->send_action_response(call_id, success, StringRef(error_message),
|
||||
reinterpret_cast<const uint8_t *>(json_buf.data()), json_buf.size());
|
||||
reinterpret_cast<const uint8_t *>(json_str.data()), json_str.size());
|
||||
return;
|
||||
}
|
||||
#endif
|
||||
|
||||
@@ -6,7 +6,7 @@ namespace esphome::aqi {

class AbstractAQICalculator {
 public:
  virtual uint16_t get_aqi(float pm2_5_value, float pm10_0_value) = 0;
  virtual uint16_t get_aqi(uint16_t pm2_5_value, uint16_t pm10_0_value) = 0;
};

}  // namespace esphome::aqi

@@ -1,7 +1,6 @@
#pragma once

#include <cmath>
#include <limits>
#include <climits>
#include "abstract_aqi_calculator.h"

// https://document.airnow.gov/technical-assistance-document-for-the-reporting-of-daily-air-quailty.pdf
@@ -10,11 +9,11 @@ namespace esphome::aqi {

class AQICalculator : public AbstractAQICalculator {
 public:
  uint16_t get_aqi(float pm2_5_value, float pm10_0_value) override {
    float pm2_5_index = calculate_index(pm2_5_value, PM2_5_GRID);
    float pm10_0_index = calculate_index(pm10_0_value, PM10_0_GRID);
  uint16_t get_aqi(uint16_t pm2_5_value, uint16_t pm10_0_value) override {
    int pm2_5_index = calculate_index(pm2_5_value, PM2_5_GRID);
    int pm10_0_index = calculate_index(pm10_0_value, PM10_0_GRID);

    return static_cast<uint16_t>(std::round((pm2_5_index < pm10_0_index) ? pm10_0_index : pm2_5_index));
    return (pm2_5_index < pm10_0_index) ? pm10_0_index : pm2_5_index;
  }

 protected:
@@ -22,28 +21,25 @@ class AQICalculator : public AbstractAQICalculator {

  static constexpr int INDEX_GRID[NUM_LEVELS][2] = {{0, 50}, {51, 100}, {101, 150}, {151, 200}, {201, 300}, {301, 500}};

  static constexpr float PM2_5_GRID[NUM_LEVELS][2] = {{0.0f, 9.0f}, {9.1f, 35.4f},
                                                      {35.5f, 55.4f}, {55.5f, 125.4f},
                                                      {125.5f, 225.4f}, {225.5f, std::numeric_limits<float>::max()}};
  static constexpr int PM2_5_GRID[NUM_LEVELS][2] = {{0, 9}, {10, 35}, {36, 55}, {56, 125}, {126, 225}, {226, INT_MAX}};

  static constexpr float PM10_0_GRID[NUM_LEVELS][2] = {{0.0f, 54.0f}, {55.0f, 154.0f},
                                                       {155.0f, 254.0f}, {255.0f, 354.0f},
                                                       {355.0f, 424.0f}, {425.0f, std::numeric_limits<float>::max()}};
  static constexpr int PM10_0_GRID[NUM_LEVELS][2] = {{0, 54}, {55, 154}, {155, 254},
                                                     {255, 354}, {355, 424}, {425, INT_MAX}};

  static float calculate_index(float value, const float array[NUM_LEVELS][2]) {
  static int calculate_index(uint16_t value, const int array[NUM_LEVELS][2]) {
    int grid_index = get_grid_index(value, array);
    if (grid_index == -1) {
      return -1.0f;
      return -1;
    }
    float aqi_lo = INDEX_GRID[grid_index][0];
    float aqi_hi = INDEX_GRID[grid_index][1];
    float conc_lo = array[grid_index][0];
    float conc_hi = array[grid_index][1];
    int aqi_lo = INDEX_GRID[grid_index][0];
    int aqi_hi = INDEX_GRID[grid_index][1];
    int conc_lo = array[grid_index][0];
    int conc_hi = array[grid_index][1];

    return (value - conc_lo) * (aqi_hi - aqi_lo) / (conc_hi - conc_lo) + aqi_lo;
  }

  static int get_grid_index(float value, const float array[NUM_LEVELS][2]) {
  static int get_grid_index(uint16_t value, const int array[NUM_LEVELS][2]) {
    for (int i = 0; i < NUM_LEVELS; i++) {
      if (value >= array[i][0] && value <= array[i][1]) {
        return i;

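
As a quick numeric check of the float-based interpolation above, using the 9.1–35.4 µg/m³ band of PM2_5_GRID (which maps to the 51–100 band of INDEX_GRID): a PM2.5 reading of 20.0 µg/m³ comes out just above 71.

```cpp
#include <cmath>
#include <cstdio>

int main() {
  // Breakpoints for 20.0 ug/m3 taken from the float grids above.
  float value = 20.0f, conc_lo = 9.1f, conc_hi = 35.4f;
  float aqi_lo = 51.0f, aqi_hi = 100.0f;
  float aqi = (value - conc_lo) * (aqi_hi - aqi_lo) / (conc_hi - conc_lo) + aqi_lo;
  std::printf("AQI = %.2f -> %d\n", aqi, (int) std::round(aqi));  // AQI = 71.31 -> 71
  return 0;
}
```
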
@@ -44,7 +44,8 @@ void AQISensor::calculate_aqi_() {
|
||||
return;
|
||||
}
|
||||
|
||||
uint16_t aqi = calculator->get_aqi(this->pm_2_5_value_, this->pm_10_0_value_);
|
||||
uint16_t aqi =
|
||||
calculator->get_aqi(static_cast<uint16_t>(this->pm_2_5_value_), static_cast<uint16_t>(this->pm_10_0_value_));
|
||||
this->publish_state(aqi);
|
||||
}
|
||||
|
||||
|
||||
@@ -10,6 +10,7 @@ class AQISensor : public sensor::Sensor, public Component {
|
||||
public:
|
||||
void setup() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override { return setup_priority::DATA; }
|
||||
|
||||
void set_pm_2_5_sensor(sensor::Sensor *sensor) { this->pm_2_5_sensor_ = sensor; }
|
||||
void set_pm_10_0_sensor(sensor::Sensor *sensor) { this->pm_10_0_sensor_ = sensor; }
|
||||
|
||||
@@ -1,18 +1,16 @@
|
||||
#pragma once
|
||||
|
||||
#include <cmath>
|
||||
#include <limits>
|
||||
#include "abstract_aqi_calculator.h"
|
||||
|
||||
namespace esphome::aqi {
|
||||
|
||||
class CAQICalculator : public AbstractAQICalculator {
|
||||
public:
|
||||
uint16_t get_aqi(float pm2_5_value, float pm10_0_value) override {
|
||||
float pm2_5_index = calculate_index(pm2_5_value, PM2_5_GRID);
|
||||
float pm10_0_index = calculate_index(pm10_0_value, PM10_0_GRID);
|
||||
uint16_t get_aqi(uint16_t pm2_5_value, uint16_t pm10_0_value) override {
|
||||
int pm2_5_index = calculate_index(pm2_5_value, PM2_5_GRID);
|
||||
int pm10_0_index = calculate_index(pm10_0_value, PM10_0_GRID);
|
||||
|
||||
return static_cast<uint16_t>(std::round((pm2_5_index < pm10_0_index) ? pm10_0_index : pm2_5_index));
|
||||
return (pm2_5_index < pm10_0_index) ? pm10_0_index : pm2_5_index;
|
||||
}
|
||||
|
||||
protected:
|
||||
@@ -20,27 +18,25 @@ class CAQICalculator : public AbstractAQICalculator {
|
||||
|
||||
static constexpr int INDEX_GRID[NUM_LEVELS][2] = {{0, 25}, {26, 50}, {51, 75}, {76, 100}, {101, 400}};
|
||||
|
||||
static constexpr float PM2_5_GRID[NUM_LEVELS][2] = {
|
||||
{0.0f, 15.0f}, {15.1f, 30.0f}, {30.1f, 55.0f}, {55.1f, 110.0f}, {110.1f, std::numeric_limits<float>::max()}};
|
||||
static constexpr int PM2_5_GRID[NUM_LEVELS][2] = {{0, 15}, {16, 30}, {31, 55}, {56, 110}, {111, 400}};
|
||||
|
||||
static constexpr float PM10_0_GRID[NUM_LEVELS][2] = {
|
||||
{0.0f, 25.0f}, {25.1f, 50.0f}, {50.1f, 90.0f}, {90.1f, 180.0f}, {180.1f, std::numeric_limits<float>::max()}};
|
||||
static constexpr int PM10_0_GRID[NUM_LEVELS][2] = {{0, 25}, {26, 50}, {51, 90}, {91, 180}, {181, 400}};
|
||||
|
||||
static float calculate_index(float value, const float array[NUM_LEVELS][2]) {
|
||||
static int calculate_index(uint16_t value, const int array[NUM_LEVELS][2]) {
|
||||
int grid_index = get_grid_index(value, array);
|
||||
if (grid_index == -1) {
|
||||
return -1.0f;
|
||||
return -1;
|
||||
}
|
||||
|
||||
float aqi_lo = INDEX_GRID[grid_index][0];
|
||||
float aqi_hi = INDEX_GRID[grid_index][1];
|
||||
float conc_lo = array[grid_index][0];
|
||||
float conc_hi = array[grid_index][1];
|
||||
int aqi_lo = INDEX_GRID[grid_index][0];
|
||||
int aqi_hi = INDEX_GRID[grid_index][1];
|
||||
int conc_lo = array[grid_index][0];
|
||||
int conc_hi = array[grid_index][1];
|
||||
|
||||
return (value - conc_lo) * (aqi_hi - aqi_lo) / (conc_hi - conc_lo) + aqi_lo;
|
||||
}
|
||||
|
||||
static int get_grid_index(float value, const float array[NUM_LEVELS][2]) {
|
||||
static int get_grid_index(uint16_t value, const int array[NUM_LEVELS][2]) {
|
||||
for (int i = 0; i < NUM_LEVELS; i++) {
|
||||
if (value >= array[i][0] && value <= array[i][1]) {
|
||||
return i;
|
||||
|
||||
@@ -13,11 +13,14 @@ from . import AQI_CALCULATION_TYPE, CONF_CALCULATION_TYPE, aqi_ns
|
||||
CODEOWNERS = ["@jasstrong"]
|
||||
DEPENDENCIES = ["sensor"]
|
||||
|
||||
UNIT_INDEX = "index"
|
||||
|
||||
AQISensor = aqi_ns.class_("AQISensor", sensor.Sensor, cg.Component)
|
||||
|
||||
CONFIG_SCHEMA = (
|
||||
sensor.sensor_schema(
|
||||
AQISensor,
|
||||
unit_of_measurement=UNIT_INDEX,
|
||||
accuracy_decimals=0,
|
||||
device_class=DEVICE_CLASS_AQI,
|
||||
state_class=STATE_CLASS_MEASUREMENT,
|
||||
|
||||
@@ -41,6 +41,8 @@ void AS3935Component::dump_config() {
|
||||
#endif
|
||||
}
|
||||
|
||||
float AS3935Component::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
void AS3935Component::loop() {
|
||||
if (!this->irq_pin_->digital_read())
|
||||
return;
|
||||
|
||||
@@ -74,6 +74,7 @@ class AS3935Component : public Component {
|
||||
public:
|
||||
void setup() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
void loop() override;
|
||||
|
||||
void set_irq_pin(GPIOPin *irq_pin) { irq_pin_ = irq_pin; }
|
||||
|
||||
@@ -22,6 +22,8 @@ static const uint8_t REGISTER_STATUS = 0x0B; // 8 bytes / R
|
||||
static const uint8_t REGISTER_AGC = 0x1A; // 8 bytes / R
|
||||
static const uint8_t REGISTER_MAGNITUDE = 0x1B; // 16 bytes / R
|
||||
|
||||
float AS5600Sensor::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
void AS5600Sensor::dump_config() {
|
||||
LOG_SENSOR("", "AS5600 Sensor", this);
|
||||
ESP_LOGCONFIG(TAG, " Out of Range Mode: %u", this->out_of_range_mode_);
|
||||
|
||||
@@ -14,6 +14,7 @@ class AS5600Sensor : public PollingComponent, public Parented<AS5600Component>,
|
||||
public:
|
||||
void update() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
|
||||
void set_angle_sensor(sensor::Sensor *angle_sensor) { this->angle_sensor_ = angle_sensor; }
|
||||
void set_raw_angle_sensor(sensor::Sensor *raw_angle_sensor) { this->raw_angle_sensor_ = raw_angle_sensor; }
|
||||
|
||||
@@ -58,6 +58,8 @@ void AS7341Component::dump_config() {
|
||||
LOG_SENSOR(" ", "NIR", this->nir_);
|
||||
}
|
||||
|
||||
float AS7341Component::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
void AS7341Component::update() {
|
||||
this->read_channels(this->channel_readings_);
|
||||
|
||||
|
||||
@@ -78,6 +78,7 @@ class AS7341Component : public PollingComponent, public i2c::I2CDevice {
|
||||
public:
|
||||
void setup() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
void update() override;
|
||||
|
||||
void set_f1_sensor(sensor::Sensor *f1_sensor) { this->f1_ = f1_sensor; }
|
||||
|
||||
@@ -38,10 +38,8 @@ async def to_code(config):
|
||||
# https://github.com/ESP32Async/ESPAsyncTCP
|
||||
cg.add_library("ESP32Async/ESPAsyncTCP", "2.0.0")
|
||||
elif CORE.is_rp2040:
|
||||
# https://github.com/ayushsharma82/RPAsyncTCP
|
||||
# RPAsyncTCP is a drop-in replacement for AsyncTCP_RP2040W with better
|
||||
# ESPAsyncWebServer compatibility
|
||||
cg.add_library("ayushsharma82/RPAsyncTCP", "1.3.2")
|
||||
# https://github.com/khoih-prog/AsyncTCP_RP2040W
|
||||
cg.add_library("khoih-prog/AsyncTCP_RP2040W", "1.2.0")
|
||||
# Other platforms (host, etc) use socket-based implementation
|
||||
|
||||
|
||||
|
||||
@@ -8,8 +8,8 @@
|
||||
// Use ESPAsyncTCP library for ESP8266 (always Arduino)
|
||||
#include <ESPAsyncTCP.h>
|
||||
#elif defined(USE_RP2040)
|
||||
// Use RPAsyncTCP library for RP2040
|
||||
#include <RPAsyncTCP.h>
|
||||
// Use AsyncTCP_RP2040W library for RP2040
|
||||
#include <AsyncTCP_RP2040W.h>
|
||||
#else
|
||||
// Use socket-based implementation for other platforms
|
||||
#include "async_tcp_socket.h"
|
||||
|
||||
@@ -146,6 +146,7 @@ void ATM90E26Component::dump_config() {
|
||||
LOG_SENSOR(" ", "Active Reverse Energy A", this->reverse_active_energy_sensor_);
|
||||
LOG_SENSOR(" ", "Frequency", this->freq_sensor_);
|
||||
}
|
||||
float ATM90E26Component::get_setup_priority() const { return setup_priority::DATA; }
|
||||
|
||||
uint16_t ATM90E26Component::read16_(uint8_t a_register) {
|
||||
uint8_t data[2];
|
||||
|
||||
@@ -13,6 +13,7 @@ class ATM90E26Component : public PollingComponent,
|
||||
public:
|
||||
void setup() override;
|
||||
void dump_config() override;
|
||||
float get_setup_priority() const override;
|
||||
void update() override;
|
||||
|
||||
void set_voltage_sensor(sensor::Sensor *obj) { this->voltage_sensor_ = obj; }
|
||||
|
||||
@@ -108,14 +108,10 @@ void ATM90E32Component::update() {
|
||||
#endif
|
||||
}
|
||||
|
||||
void ATM90E32Component::get_cs_summary_(std::span<char, GPIO_SUMMARY_MAX_LEN> buffer) {
|
||||
this->cs_->dump_summary(buffer.data(), buffer.size());
|
||||
}
|
||||
|
||||
void ATM90E32Component::setup() {
|
||||
this->spi_setup();
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
this->cs_summary_ = this->cs_->dump_summary();
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
|
||||
uint16_t mmode0 = 0x87; // 3P4W 50Hz
|
||||
uint16_t high_thresh = 0;
|
||||
@@ -162,14 +158,12 @@ void ATM90E32Component::setup() {
|
||||
|
||||
if (this->enable_offset_calibration_) {
|
||||
// Initialize flash storage for offset calibrations
|
||||
uint32_t o_hash = fnv1_hash("_offset_calibration_");
|
||||
o_hash = fnv1_hash_extend(o_hash, cs);
|
||||
uint32_t o_hash = fnv1_hash(std::string("_offset_calibration_") + this->cs_summary_);
|
||||
this->offset_pref_ = global_preferences->make_preference<OffsetCalibration[3]>(o_hash, true);
|
||||
this->restore_offset_calibrations_();
|
||||
|
||||
// Initialize flash storage for power offset calibrations
|
||||
uint32_t po_hash = fnv1_hash("_power_offset_calibration_");
|
||||
po_hash = fnv1_hash_extend(po_hash, cs);
|
||||
uint32_t po_hash = fnv1_hash(std::string("_power_offset_calibration_") + this->cs_summary_);
|
||||
this->power_offset_pref_ = global_preferences->make_preference<PowerOffsetCalibration[3]>(po_hash, true);
|
||||
this->restore_power_offset_calibrations_();
|
||||
} else {
|
||||
@@ -189,8 +183,7 @@ void ATM90E32Component::setup() {
|
||||
|
||||
if (this->enable_gain_calibration_) {
|
||||
// Initialize flash storage for gain calibration
|
||||
uint32_t g_hash = fnv1_hash("_gain_calibration_");
|
||||
g_hash = fnv1_hash_extend(g_hash, cs);
|
||||
uint32_t g_hash = fnv1_hash(std::string("_gain_calibration_") + this->cs_summary_);
|
||||
this->gain_calibration_pref_ = global_preferences->make_preference<GainCalibration[3]>(g_hash, true);
|
||||
this->restore_gain_calibrations_();
|
||||
|
||||
@@ -221,8 +214,7 @@ void ATM90E32Component::setup() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::log_calibration_status_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
|
||||
bool offset_mismatch = false;
|
||||
bool power_mismatch = false;
|
||||
@@ -573,8 +565,7 @@ float ATM90E32Component::get_chip_temperature_() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::run_gain_calibrations() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
if (!this->enable_gain_calibration_) {
|
||||
ESP_LOGW(TAG, "[CALIBRATION][%s] Gain calibration is disabled! Enable it first with enable_gain_calibration: true",
|
||||
cs);
|
||||
@@ -674,8 +665,7 @@ void ATM90E32Component::run_gain_calibrations() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::save_gain_calibration_to_memory_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
bool success = this->gain_calibration_pref_.save(&this->gain_phase_);
|
||||
global_preferences->sync();
|
||||
if (success) {
|
||||
@@ -688,8 +678,7 @@ void ATM90E32Component::save_gain_calibration_to_memory_() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::save_offset_calibration_to_memory_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
bool success = this->offset_pref_.save(&this->offset_phase_);
|
||||
global_preferences->sync();
|
||||
if (success) {
|
||||
@@ -705,8 +694,7 @@ void ATM90E32Component::save_offset_calibration_to_memory_() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::save_power_offset_calibration_to_memory_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
bool success = this->power_offset_pref_.save(&this->power_offset_phase_);
|
||||
global_preferences->sync();
|
||||
if (success) {
|
||||
@@ -722,8 +710,7 @@ void ATM90E32Component::save_power_offset_calibration_to_memory_() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::run_offset_calibrations() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
if (!this->enable_offset_calibration_) {
|
||||
ESP_LOGW(TAG,
|
||||
"[CALIBRATION][%s] Offset calibration is disabled! Enable it first with enable_offset_calibration: true",
|
||||
@@ -753,8 +740,7 @@ void ATM90E32Component::run_offset_calibrations() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::run_power_offset_calibrations() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
if (!this->enable_offset_calibration_) {
|
||||
ESP_LOGW(
|
||||
TAG,
|
||||
@@ -827,8 +813,7 @@ void ATM90E32Component::write_power_offsets_to_registers_(uint8_t phase, int16_t
|
||||
}
|
||||
|
||||
void ATM90E32Component::restore_gain_calibrations_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
for (uint8_t i = 0; i < 3; ++i) {
|
||||
this->config_gain_phase_[i].voltage_gain = this->phase_[i].voltage_gain_;
|
||||
this->config_gain_phase_[i].current_gain = this->phase_[i].ct_gain_;
|
||||
@@ -882,8 +867,7 @@ void ATM90E32Component::restore_gain_calibrations_() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::restore_offset_calibrations_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
for (uint8_t i = 0; i < 3; ++i)
|
||||
this->config_offset_phase_[i] = this->offset_phase_[i];
|
||||
|
||||
@@ -925,8 +909,7 @@ void ATM90E32Component::restore_offset_calibrations_() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::restore_power_offset_calibrations_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
for (uint8_t i = 0; i < 3; ++i)
|
||||
this->config_power_offset_phase_[i] = this->power_offset_phase_[i];
|
||||
|
||||
@@ -968,8 +951,7 @@ void ATM90E32Component::restore_power_offset_calibrations_() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::clear_gain_calibrations() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
if (!this->using_saved_calibrations_) {
|
||||
ESP_LOGI(TAG, "[CALIBRATION][%s] No stored gain calibrations to clear. Current values:", cs);
|
||||
ESP_LOGI(TAG, "[CALIBRATION][%s] ----------------------------------------------------------", cs);
|
||||
@@ -1018,8 +1000,7 @@ void ATM90E32Component::clear_gain_calibrations() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::clear_offset_calibrations() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
if (!this->restored_offset_calibration_) {
|
||||
ESP_LOGI(TAG, "[CALIBRATION][%s] No stored offset calibrations to clear. Current values:", cs);
|
||||
ESP_LOGI(TAG, "[CALIBRATION][%s] --------------------------------------------------------------", cs);
|
||||
@@ -1061,8 +1042,7 @@ void ATM90E32Component::clear_offset_calibrations() {
|
||||
}
|
||||
|
||||
void ATM90E32Component::clear_power_offset_calibrations() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
if (!this->restored_power_offset_calibration_) {
|
||||
ESP_LOGI(TAG, "[CALIBRATION][%s] No stored power offsets to clear. Current values:", cs);
|
||||
ESP_LOGI(TAG, "[CALIBRATION][%s] ---------------------------------------------------------------------", cs);
|
||||
@@ -1137,8 +1117,7 @@ int16_t ATM90E32Component::calibrate_power_offset(uint8_t phase, bool reactive)
|
||||
}
|
||||
|
||||
bool ATM90E32Component::verify_gain_writes_() {
|
||||
char cs[GPIO_SUMMARY_MAX_LEN];
|
||||
this->get_cs_summary_(cs);
|
||||
const char *cs = this->cs_summary_.c_str();
|
||||
bool success = true;
|
||||
for (uint8_t phase = 0; phase < 3; phase++) {
|
||||
uint16_t read_voltage = this->read16_(voltage_gain_registers[phase]);
|
||||
|
||||
@@ -1,13 +1,11 @@
|
||||
#pragma once
|
||||
|
||||
#include <span>
|
||||
#include <unordered_map>
|
||||
#include "atm90e32_reg.h"
|
||||
#include "esphome/components/sensor/sensor.h"
|
||||
#include "esphome/components/spi/spi.h"
|
||||
#include "esphome/core/application.h"
|
||||
#include "esphome/core/component.h"
|
||||
#include "esphome/core/gpio.h"
|
||||
#include "esphome/core/helpers.h"
|
||||
#include "esphome/core/preferences.h"
|
||||
|
||||
@@ -184,7 +182,6 @@ class ATM90E32Component : public PollingComponent,
|
||||
bool verify_gain_writes_();
|
||||
bool validate_spi_read_(uint16_t expected, const char *context = nullptr);
|
||||
void log_calibration_status_();
|
||||
void get_cs_summary_(std::span<char, GPIO_SUMMARY_MAX_LEN> buffer);
|
||||
|
||||
struct ATM90E32Phase {
|
||||
uint16_t voltage_gain_{0};
|
||||
@@ -250,6 +247,7 @@ class ATM90E32Component : public PollingComponent,
|
||||
ESPPreferenceObject offset_pref_;
|
||||
ESPPreferenceObject power_offset_pref_;
|
||||
ESPPreferenceObject gain_calibration_pref_;
|
||||
std::string cs_summary_;
|
||||
|
||||
sensor::Sensor *freq_sensor_{nullptr};
|
||||
#ifdef USE_TEXT_SENSOR
|
||||
|
||||
@@ -1,5 +1,4 @@
|
||||
import esphome.codegen as cg
|
||||
from esphome.components.esp32 import add_idf_component, include_builtin_idf_component
|
||||
import esphome.config_validation as cv
|
||||
from esphome.const import CONF_BITS_PER_SAMPLE, CONF_NUM_CHANNELS, CONF_SAMPLE_RATE
|
||||
import esphome.final_validate as fv
|
||||
@@ -166,10 +165,4 @@ def final_validate_audio_schema(
|
||||
|
||||
|
||||
async def to_code(config):
|
||||
# Re-enable ESP-IDF's HTTP client (excluded by default to save compile time)
|
||||
include_builtin_idf_component("esp_http_client")
|
||||
|
||||
add_idf_component(
|
||||
name="esphome/esp-audio-libs",
|
||||
ref="2.0.3",
|
||||
)
|
||||
cg.add_library("esphome/esp-audio-libs", "2.0.1")
|
||||
|
||||
@@ -300,7 +300,7 @@ FileDecoderState AudioDecoder::decode_mp3_() {
|
||||
|
||||
// Advance read pointer to match the offset for the syncword
|
||||
this->input_transfer_buffer_->decrease_buffer_length(offset);
|
||||
const uint8_t *buffer_start = this->input_transfer_buffer_->get_buffer_start();
|
||||
uint8_t *buffer_start = this->input_transfer_buffer_->get_buffer_start();
|
||||
|
||||
buffer_length = (int) this->input_transfer_buffer_->available();
|
||||
int err = esp_audio_libs::helix_decoder::MP3Decode(this->mp3_decoder_, &buffer_start, &buffer_length,
|
||||
|
||||
@@ -185,16 +185,18 @@ esp_err_t AudioReader::start(const std::string &uri, AudioFileType &file_type) {
    return err;
  }

  if (str_endswith_ignore_case(url, ".wav")) {
  std::string url_string = str_lower_case(url);

  if (str_endswith(url_string, ".wav")) {
    file_type = AudioFileType::WAV;
  }
#ifdef USE_AUDIO_MP3_SUPPORT
  else if (str_endswith_ignore_case(url, ".mp3")) {
  else if (str_endswith(url_string, ".mp3")) {
    file_type = AudioFileType::MP3;
  }
#endif
#ifdef USE_AUDIO_FLAC_SUPPORT
  else if (str_endswith_ignore_case(url, ".flac")) {
  else if (str_endswith(url_string, ".flac")) {
    file_type = AudioFileType::FLAC;
  }
#endif

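
The single-call `str_endswith_ignore_case(...)` form above avoids allocating a lowercased copy of the whole URL just to inspect its extension. A self-contained sketch of that kind of allocation-free, case-insensitive suffix test (hypothetical helper, not ESPHome's actual `str_endswith_ignore_case` implementation):

```cpp
#include <cctype>
#include <cstddef>
#include <cstring>
#include <string>

// Compare the tail of `str` against `suffix`, ignoring ASCII case,
// without building a lowercased copy of either string.
inline bool ends_with_ignore_case(const std::string &str, const char *suffix) {
  const size_t suffix_len = std::strlen(suffix);
  if (str.size() < suffix_len)
    return false;
  const char *tail = str.c_str() + (str.size() - suffix_len);
  for (size_t i = 0; i < suffix_len; i++) {
    if (std::tolower(static_cast<unsigned char>(tail[i])) !=
        std::tolower(static_cast<unsigned char>(suffix[i]))) {
      return false;
    }
  }
  return true;
}
```
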
@@ -6,7 +6,8 @@ namespace bang_bang {
|
||||
|
||||
static const char *const TAG = "bang_bang.climate";
|
||||
|
||||
BangBangClimate::BangBangClimate() = default;
|
||||
BangBangClimate::BangBangClimate()
|
||||
: idle_trigger_(new Trigger<>()), cool_trigger_(new Trigger<>()), heat_trigger_(new Trigger<>()) {}
|
||||
|
||||
void BangBangClimate::setup() {
|
||||
this->sensor_->add_on_state_callback([this](float state) {
|
||||
@@ -159,13 +160,13 @@ void BangBangClimate::switch_to_action_(climate::ClimateAction action) {
|
||||
switch (action) {
|
||||
case climate::CLIMATE_ACTION_OFF:
|
||||
case climate::CLIMATE_ACTION_IDLE:
|
||||
trig = &this->idle_trigger_;
|
||||
trig = this->idle_trigger_;
|
||||
break;
|
||||
case climate::CLIMATE_ACTION_COOLING:
|
||||
trig = &this->cool_trigger_;
|
||||
trig = this->cool_trigger_;
|
||||
break;
|
||||
case climate::CLIMATE_ACTION_HEATING:
|
||||
trig = &this->heat_trigger_;
|
||||
trig = this->heat_trigger_;
|
||||
break;
|
||||
default:
|
||||
trig = nullptr;
|
||||
@@ -203,9 +204,9 @@ void BangBangClimate::set_away_config(const BangBangClimateTargetTempConfig &awa
|
||||
void BangBangClimate::set_sensor(sensor::Sensor *sensor) { this->sensor_ = sensor; }
|
||||
void BangBangClimate::set_humidity_sensor(sensor::Sensor *humidity_sensor) { this->humidity_sensor_ = humidity_sensor; }
|
||||
|
||||
Trigger<> *BangBangClimate::get_idle_trigger() { return &this->idle_trigger_; }
|
||||
Trigger<> *BangBangClimate::get_cool_trigger() { return &this->cool_trigger_; }
|
||||
Trigger<> *BangBangClimate::get_heat_trigger() { return &this->heat_trigger_; }
|
||||
Trigger<> *BangBangClimate::get_idle_trigger() const { return this->idle_trigger_; }
|
||||
Trigger<> *BangBangClimate::get_cool_trigger() const { return this->cool_trigger_; }
|
||||
Trigger<> *BangBangClimate::get_heat_trigger() const { return this->heat_trigger_; }
|
||||
|
||||
void BangBangClimate::set_supports_cool(bool supports_cool) { this->supports_cool_ = supports_cool; }
|
||||
void BangBangClimate::set_supports_heat(bool supports_heat) { this->supports_heat_ = supports_heat; }