Mirror of https://github.com/esphome/esphome.git (synced 2025-11-02 08:01:50 +00:00)

Compare commits (136 commits):
0061489a97, 4dce7fa103, 467ef9902f, 486174073b, e9d9de448e, 08c16020c6, 0ade9baf65, 2fab7e73b9, af4e2bf61d, 74fefea5bb,
21c22fe04a, 70206df8b5, 15732ca465, 8e0f4f93d4, 361baea17f, 8bbfbc4cc1, 8c5d12df51, 25c66ed8ca, 7a55521807, d04de7baeb,
4aeb756388, 92b6ed4180, 9efd9f4fe8, 34fc2147a4, 629f2b128e, 81bc400340, 75628b96a1, b1f7ed4fdc, b8d7185d99, 2d20a1c0fb,
820067ae5a, db8313e0d5, 27a77c685d, e34365dc7c, 8d395e5338, 6f54afec00, 6a24145be6, 4a2cdbf31c, 1d75ed1ff4, 76b1c6f47b,
06371c9e2d, a9c130dd50, 1f82c1a483, cb28429231, f2cd2ec178, 37360bb797, 6c1dc0f2b3, f7671f0c90, 639b97ccb2, 0374b3a0b3,
7ce753b76f, 05a1089ed2, 71947bb6ac, ef54e33b70, 12f20fc3cf, ffdcddc18e, 85d70eb5a0, ffb793177a, d3f2fab88a, 0fa52d0ce6,
cf264a2743, 433b605bef, 6e60c6493a, 4e63bc96d5, ce4b339d16, ab6d293d0d, 5e5137960d, 8dba37846b, 5cd82d7c25, bc8354bad5,
2abbe1bca3, 74c70509c2, 1576e1847e, 9ea9b4b102, 490743c26e, 1c7bddd005, 5c39f73fda, 2fc78a1b33, 03249780fd, 5170a7cdf4,
5680de79a9, 61bd2a3a44, 7802c63f5a, 94bef2a5a4, 12c4b0788c, b13cc3a4a5, c5d9bc5452, 6c6d21a7ab, 9bb06782b2, a827b51887,
2c30d80490, e0acdc3ae0, 36a3f96011, 6ae8e3495f, 6aa449115f, 0be29d27d5, 68fa7489a2, d7699c93d6, a04438e924, e063f2aaea,
7b630bfb8b, ec3366cce0, 135117714b, 91e6304505, 361a9da868, 31d7656c07, d1a7751dc9, e47bcb9abb, 30b94f06b3, 19dfdb77eb,
b6e7ad3589, 1267680379, b5f6b32fad, 7068808d4f, c82a44d541, e650473682, 2ee0d4242d, b2aecf29dc, 081c2f4e07, 0c77228421,
8a509c6551, d249820bcd, f39cf52eae, 63fb252b46, 2f98adca49, 084fc00517, abeac23abd, 7264e3a4bf, 16eeb3af31, b799d2df7f,
11d55fec4f, 923a5990c7, a0a615c335, e9ced0cc3f, 1a92381994, a07a7a87a2
@@ -106,3 +106,5 @@ venv.bak/
config/
examples/
Dockerfile
.git/
tests/build/
.github/ISSUE_TEMPLATE/bug_report.md (new file, 47 lines, vendored)
@@ -0,0 +1,47 @@
---
name: Bug report
about: Create a report to help us improve

---

<!-- Thanks for reporting a bug for this project. READ THIS FIRST:
- Please make sure to submit issues in the right GitHub repository, if unsure just post it here:
  - esphomeyaml [here] - This is mostly for reporting bugs when compiling and when you get a long stack trace while compiling or if a configuration fails to validate.
  - esphomelib [https://github.com/OttoWinter/esphomelib] - Report bugs there if the ESP is crashing or a feature is not working as expected.
  - esphomedocs [https://github.com/OttoWinter/esphomedocs] - Report bugs there if the documentation is wrong/outdated.
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks (```). Do not delete any text from this template!
-->

**Operating environment (Hass.io/Docker/pip/etc.):**
<!--
Please provide details about your environment.
-->

**ESP (ESP32/ESP8266/Board/Sonoff):**
<!--
Please provide details about which ESP you're using.
-->

**Affected component:**
<!--
Please add the link to the documentation at https://esphomelib.com/esphomeyaml/index.html of the component in question.
-->


**Description of problem:**


**Problem-relevant YAML-configuration entries:**
```yaml

```

**Traceback (if applicable):**
<!--
Please copy the traceback here if compilation is failing. If possible, also connect to the ESP and copy its logs into the backticks.
-->
```

```

**Additional information:**
.github/ISSUE_TEMPLATE/feature_request.md (new file, 22 lines, vendored)
@@ -0,0 +1,22 @@
---
name: Feature request
about: Suggest an idea for this project

---

<!-- READ THIS FIRST:
- This is for feature requests only, if you want to have a certain new sensor/module supported, please use the "new integration" template.
- Please be as descriptive as possible, especially use-cases that can otherwise not be solved boost the problem's priority.
-->

**Is your feature request related to a problem? Please describe.**
<!--
A clear and concise description of what the problem is.
-->
Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A description of what you want to happen.

**Additional context**
Add any other context about the feature request here.
.github/ISSUE_TEMPLATE/new-integration.md (new file, 20 lines, vendored)
@@ -0,0 +1,20 @@
---
name: New integration
about: Suggest a new integration for esphomelib

---

<!-- READ THIS FIRST:
- This is for new integrations (such as new sensors/modules) only, for new features within the environment please use the "feature request" template.
- Do not delete anything from this template and fill out the form as precisely as possible.
-->

**What new integration would you wish to have?**
<!-- A name/description of the new integration/board. -->

**If possible, provide a link to an existing library for the integration:**

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Additional context**
.github/PULL_REQUEST_TEMPLATE.md (new file, 20 lines, vendored)
@@ -0,0 +1,20 @@
## Description:


**Related issue (if applicable):** fixes <link to issue>

**Pull request in [esphomedocs](https://github.com/OttoWinter/esphomedocs) with documentation (if applicable):** OttoWinter/esphomedocs#<esphomedocs PR number goes here>
**Pull request in [esphomelib](https://github.com/OttoWinter/esphomelib) with C++ framework changes (if applicable):** OttoWinter/esphomelib#<esphomelib PR number goes here>

## Example entry for YAML configuration (if applicable):
```yaml

```

## Checklist:
  - [ ] The code change is tested and works locally.
  - [ ] Tests have been added to verify that the new code works (under `tests/` folder).
  - [ ] Check this box if you have read, understand, comply, and agree with the [Code of Conduct](https://github.com/OttoWinter/esphomeyaml/blob/master/CODE_OF_CONDUCT.md).

If user exposed functionality or configuration variables are added/changed:
  - [ ] Documentation added/updated in [esphomedocs](https://github.com/OttoWinter/esphomedocs).
.gitignore (1 line changed, vendored)
@@ -104,3 +104,4 @@ venv.bak/
.mypy_cache/

config/
tests/build/
.gitlab-ci.yml (new file, 233 lines)
@@ -0,0 +1,233 @@
---
|
||||
# Based on https://gitlab.com/hassio-addons/addon-node-red/blob/master/.gitlab-ci.yml
|
||||
variables:
|
||||
DOCKER_DRIVER: overlay2
|
||||
|
||||
stages:
|
||||
- lint
|
||||
- test
|
||||
- build
|
||||
- deploy
|
||||
|
||||
.lint: &lint
|
||||
stage: lint
|
||||
tags:
|
||||
- python2.7
|
||||
- esphomeyaml-lint
|
||||
|
||||
.test: &test
|
||||
stage: test
|
||||
before_script:
|
||||
- pip install -e .
|
||||
tags:
|
||||
- python2.7
|
||||
- esphomeyaml-test
|
||||
variables:
|
||||
TZ: UTC
|
||||
cache:
|
||||
paths:
|
||||
- tests/build
|
||||
|
||||
.docker-builder: &docker-builder
|
||||
before_script:
|
||||
- docker info
|
||||
- docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
|
||||
services:
|
||||
- docker:dind
|
||||
tags:
|
||||
- hassio-builder
|
||||
|
||||
flake8:
|
||||
<<: *lint
|
||||
script:
|
||||
- flake8 esphomeyaml
|
||||
|
||||
pylint:
|
||||
<<: *lint
|
||||
script:
|
||||
- pylint esphomeyaml
|
||||
|
||||
test1:
|
||||
<<: *test
|
||||
script:
|
||||
- esphomeyaml tests/test1.yaml compile
|
||||
|
||||
test2:
|
||||
<<: *test
|
||||
script:
|
||||
- esphomeyaml tests/test2.yaml compile
|
||||
|
||||
.build-hassio: &build-hassio
|
||||
<<: *docker-builder
|
||||
stage: build
|
||||
script:
|
||||
- |
|
||||
hassio-builder.sh \
|
||||
-t . \
|
||||
-i ottowinter/esphomeyaml-hassio-${ADDON_ARCH} \
|
||||
-d "$CI_REGISTRY" \
|
||||
--${ADDON_ARCH}
|
||||
- |
|
||||
docker tag \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:dev" \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
|
||||
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
|
||||
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:dev"
|
||||
retry: 2
|
||||
|
||||
# Generic deploy template
|
||||
.deploy-release: &deploy-release
|
||||
<<: *docker-builder
|
||||
stage: deploy
|
||||
script:
|
||||
- version=${CI_COMMIT_TAG:1}
|
||||
- echo "Publishing release version ${version}"
|
||||
- docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"
|
||||
- docker pull "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
|
||||
- |
|
||||
docker tag \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
|
||||
- |
|
||||
docker tag \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
|
||||
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
|
||||
- |
|
||||
docker tag \
|
||||
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
|
||||
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
|
||||
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
|
||||
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
|
||||
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:latest"
|
||||
only:
|
||||
- /^v\d+\.\d+\.\d+$/
|
||||
except:
|
||||
- /^(?!master).+@/
|
||||
|
||||
.deploy-beta: &deploy-beta
|
||||
<<: *docker-builder
|
||||
stage: deploy
|
||||
script:
|
||||
- version=${CI_COMMIT_TAG:1}
|
||||
- echo "Publishing beta version ${version}"
|
||||
- docker login -u "$DOCKER_USER" -p "$DOCKER_PASSWORD"
|
||||
- docker pull "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}"
|
||||
- |
|
||||
docker tag \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
|
||||
- |
|
||||
docker tag \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${CI_COMMIT_SHA}" \
|
||||
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
|
||||
- |
|
||||
docker tag \
|
||||
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}" \
|
||||
"ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
|
||||
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:beta"
|
||||
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:${version}"
|
||||
- docker push "ottowinter/esphomeyaml-hassio-${ADDON_ARCH}:rc"
|
||||
only:
|
||||
- /^v\d+\.\d+\.\d+b\d+$/
|
||||
except:
|
||||
- /^(?!rc).+@/
|
||||
|
||||
# Build jobs
|
||||
build:normal:
|
||||
<<: *docker-builder
|
||||
stage: build
|
||||
script:
|
||||
- docker build -t "${CI_REGISTRY}/ottowinter/esphomeyaml:dev" .
|
||||
- |
|
||||
docker tag \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml:dev" \
|
||||
"${CI_REGISTRY}/ottowinter/esphomeyaml:${CI_COMMIT_SHA}"
|
||||
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml:${CI_COMMIT_SHA}"
|
||||
- docker push "${CI_REGISTRY}/ottowinter/esphomeyaml:dev"
|
||||
|
||||
build:armhf:
|
||||
<<: *build-hassio
|
||||
variables:
|
||||
ADDON_ARCH: armhf
|
||||
|
||||
#build:aarch64:
|
||||
# <<: *build
|
||||
# variables:
|
||||
# ADDON_ARCH: aarch64
|
||||
|
||||
build:i386:
|
||||
<<: *build-hassio
|
||||
variables:
|
||||
ADDON_ARCH: i386
|
||||
|
||||
build:amd64:
|
||||
<<: *build-hassio
|
||||
variables:
|
||||
ADDON_ARCH: amd64
|
||||
|
||||
# Deploy jobs
|
||||
deploy-release:armhf:
|
||||
<<: *deploy-release
|
||||
variables:
|
||||
ADDON_ARCH: armhf
|
||||
|
||||
deploy-beta:armhf:
|
||||
<<: *deploy-beta
|
||||
variables:
|
||||
ADDON_ARCH: armhf
|
||||
|
||||
#deploy-release:aarch64:
|
||||
# <<: *deploy-release
|
||||
# variables:
|
||||
# ADDON_ARCH: aarch64
|
||||
|
||||
#deploy-beta:aarch64:
|
||||
# <<: *deploy-beta
|
||||
# variables:
|
||||
# ADDON_ARCH: aarch64
|
||||
|
||||
deploy-release:i386:
|
||||
<<: *deploy-release
|
||||
variables:
|
||||
ADDON_ARCH: i386
|
||||
|
||||
deploy-beta:i386:
|
||||
<<: *deploy-beta
|
||||
variables:
|
||||
ADDON_ARCH: i386
|
||||
|
||||
deploy-release:amd64:
|
||||
<<: *deploy-release
|
||||
variables:
|
||||
ADDON_ARCH: amd64
|
||||
|
||||
deploy-beta:amd64:
|
||||
<<: *deploy-beta
|
||||
variables:
|
||||
ADDON_ARCH: amd64
|
||||
|
||||
.deploy-pypi: &deploy-pypi
|
||||
stage: deploy
|
||||
before_script:
|
||||
- pip install -e .
|
||||
- pip install twine
|
||||
script:
|
||||
- python setup.py sdist
|
||||
- twine upload dist/*
|
||||
tags:
|
||||
- python2.7
|
||||
- esphomeyaml-test
|
||||
|
||||
deploy-release:pypi:
|
||||
<<: *deploy-pypi
|
||||
only:
|
||||
- /^v\d+\.\d+\.\d+$/
|
||||
except:
|
||||
- /^(?!master).+@/
|
||||
|
||||
deploy-beta:pypi:
|
||||
<<: *deploy-pypi
|
||||
only:
|
||||
- /^v\d+\.\d+\.\d+b\d+$/
|
||||
except:
|
||||
- /^(?!rc).+@/
|
||||
.travis.yml (22 lines changed)
@@ -2,9 +2,19 @@ sudo: false
language: python
python:
  - "2.7"
install:
  - pip install -r requirements.txt
  - pip install tornado esptool flake8==3.5.0 pylint==1.8.4
script:
  - flake8 esphomeyaml
  - pylint esphomeyaml
jobs:
  include:
    - name: "Lint"
      install:
        - pip install -r requirements.txt
        - pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
      script:
        - flake8 esphomeyaml
        - pylint esphomeyaml
    - name: "Test"
      install:
        - pip install -e .
        - pip install tzlocal pillow
      script:
        - esphomeyaml tests/test1.yaml compile
        - esphomeyaml tests/test2.yaml compile
CODE_OF_CONDUCT.md (new file, 46 lines)
@@ -0,0 +1,46 @@
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at contact@otto-winter.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version]

[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/
CONTRIBUTING.md (new file, 18 lines)
@@ -0,0 +1,18 @@
# Contributing to esphomeyaml

esphomeyaml is a part of esphomelib and is responsible for reading in YAML configuration files and
converting them to C++ code. This code is then converted to a platformio project and compiled
with [esphomelib](https://github.com/OttoWinter/esphomelib), the C++ framework behind the project.

For a detailed guide, please see https://esphomelib.com/esphomeyaml/guides/contributing.html#contributing-to-esphomeyaml

Things to note when contributing:

- Please test your changes :)
- If a new feature is added or an existing user-facing feature is changed, you should also
  update the [docs](https://github.com/OttoWinter/esphomedocs). See [contributing to esphomedocs](https://esphomelib.com/esphomeyaml/guides/contributing.html#contributing-to-esphomedocs)
  for more information.
- Please also update the tests in the `tests/` folder. You can do so by just adding a line in one of the YAML files
  which checks that your new feature compiles correctly.
- Sometimes I will let pull requests linger because I'm not 100% sure about them. Please feel free to ping
  me after some time.
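The note above about extending the `tests/` YAML files can be illustrated with a short sketch. Assuming a hypothetical contribution that adds a DHT-style temperature/humidity sensor platform, the corresponding compile-test entry in `tests/test1.yaml` might look roughly like this (platform name, pin, and entity names are placeholders, not part of this changeset):

```yaml
# Hypothetical test entry so the CI "compile" jobs build the new component.
# The platform, pin, and names below are illustrative only.
sensor:
  - platform: dht
    pin: D2
    temperature:
      name: "Test Temperature"
    humidity:
      name: "Test Humidity"
    update_interval: 60s
```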
Dockerfile (13 lines changed)
@@ -1,21 +1,26 @@
FROM python:2.7
MAINTAINER Otto Winter <contact@otto-winter.com>

RUN apt-get update && apt-get install -y \
        python-pil \
    && rm -rf /var/lib/apt/lists/*

ENV ESPHOMEYAML_OTA_HOST_PORT=6123
EXPOSE 6123
VOLUME /config
WORKDIR /usr/src/app

COPY requirements.txt /usr/src/app/
RUN pip install --no-cache-dir -r requirements.txt && \
    pip install --no-cache-dir tornado esptool
RUN pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No

COPY docker/platformio.ini /usr/src/app/
RUN platformio settings set enable_telemetry No && \
    platformio run -e espressif32 -e espressif8266; exit 0

COPY . .
RUN pip install -e .
RUN pip install --no-cache-dir -e . && \
    pip install --no-cache-dir tzlocal pillow

WORKDIR /config
ENTRYPOINT ["esphomeyaml"]
CMD ["/config", "dashboard"]
LICENSE (new file, 21 lines)
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2018 Otto Winter

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@@ -1,10 +1,10 @@
# esphomeyaml for [esphomelib](https://github.com/OttoWinter/esphomelib)

### Getting Started Guide: https://esphomelib.com/esphomeyaml/getting-started.html
### Getting Started Guide: https://esphomelib.com/esphomeyaml/guides/getting_started_command_line.html

### Available Components: https://esphomelib.com/esphomeyaml/index.html

esphomeyaml is the solution for your ESP8266/ESP32 projects with Home Assistant. It allows you to create **custom firmwares** for your microcontrollers with no programming experience required. All you need to know is the YAML configuration format, which is also used by Home Assistant.
esphomeyaml is the solution for your ESP8266/ESP32 projects with Home Assistant. It allows you to create **custom firmwares** for your microcontrollers with no programming experience required. All you need to know is the YAML configuration format, which is also used by [Home Assistant](https://www.home-assistant.io).

esphomeyaml will:

@@ -26,7 +26,7 @@ esphomeyaml configuration.yaml run
files like you're used to with Home Assistant.
* **Flexible:** Use [esphomelib](https://github.com/OttoWinter/esphomelib)'s powerful core to create custom sensors/outputs.
* **Fast and efficient:** Written in C++ and keeps memory consumption to a minimum.
* **Made for Home Assistant:** Almost all Home Assistant features are supported out of the box, including RGB lights and many more.
* **Made for [Home Assistant](https://www.home-assistant.io):** Almost all [Home Assistant](https://www.home-assistant.io) features are supported out of the box, including RGB lights and many more.
* **Easy reproducible configuration:** No need to go through a long setup process for every single node. Just copy a configuration file and run a single command.
* **Smart Over The Air Updates:** esphomeyaml has OTA updates deeply integrated into the system. It even automatically enters a recovery mode if a boot loop is detected.
* **Powerful logging engine:** View colorful logs and debug issues remotely.
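As a minimal illustration of that configuration format, a node definition might look like the following sketch (node name, board, and credentials are placeholders and are not taken from this changeset):

```yaml
# Hypothetical minimal esphomeyaml configuration for a single node.
esphomeyaml:
  name: livingroom
  platform: ESP8266
  board: nodemcuv2

wifi:
  ssid: "MyNetwork"
  password: "MyPassword"

mqtt:
  broker: 192.168.1.100

# Enable logging and over-the-air updates.
logger:
ota:
```

Building and flashing such a node is then a single command, e.g. `esphomeyaml configuration.yaml run`.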
docker/Dockerfile.aarch64 (new file, 21 lines)
@@ -0,0 +1,21 @@
# Dockerfile for aarch64 version of HassIO add-on
FROM arm64v8/ubuntu:bionic

RUN apt-get update && apt-get install -y --no-install-recommends \
        python \
        python-pip \
        python-setuptools \
        python-pil \
        git \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No

COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
    pip install --no-cache-dir --no-binary :all: tzlocal

CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
docker/Dockerfile.amd64 (new file, 21 lines)
@@ -0,0 +1,21 @@
# Dockerfile for amd64 version of HassIO add-on
FROM ubuntu:bionic

RUN apt-get update && apt-get install -y --no-install-recommends \
        python \
        python-pip \
        python-setuptools \
        python-pil \
        git \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No

COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
    pip install --no-cache-dir --no-binary :all: tzlocal

CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
docker/Dockerfile.armhf (new file, 31 lines)
@@ -0,0 +1,31 @@
# Dockerfile for armhf version of HassIO add-on
FROM homeassistant/armhf-base:latest

RUN apk add --no-cache \
        python2 \
        python2-dev \
        py2-pip \
        git \
        gcc \
        openssh \
        libc6-compat \
        jpeg-dev \
        zlib-dev \
        freetype-dev \
        lcms2-dev \
        openjpeg-dev \
        tiff-dev \
        libc-dev \
        linux-headers \
    && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No

COPY docker/platformio-esp8266.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
    pip install --no-cache-dir pillow tzlocal

CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
docker/Dockerfile.builder (new file, 32 lines)
@@ -0,0 +1,32 @@
FROM multiarch/ubuntu-core:amd64-xenial

# setup locales
RUN apt-get update && apt-get install -y \
        jq \
        git \
        python3-setuptools \
    && rm -rf /var/lib/apt/lists/*
ENV LANG C.UTF-8

# Install docker
# https://docs.docker.com/engine/installation/linux/docker-ce/ubuntu/
RUN apt-get update && apt-get install -y \
        apt-transport-https \
        ca-certificates \
        curl \
        software-properties-common \
    && rm -rf /var/lib/apt/lists/* \
    && curl -fsSL https://download.docker.com/linux/ubuntu/gpg | apt-key add - \
    && add-apt-repository "deb https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" \
    && apt-get update && apt-get install -y docker-ce \
    && rm -rf /var/lib/apt/lists/*

# setup arm binary support
RUN apt-get update && apt-get install -y \
        qemu-user-static \
        binfmt-support \
    && rm -rf /var/lib/apt/lists/*

COPY docker/hassio-builder.sh /usr/bin/

WORKDIR /data
docker/Dockerfile.i386 (new file, 21 lines)
@@ -0,0 +1,21 @@
# Dockerfile for i386 version of HassIO add-on
FROM i386/ubuntu:bionic

RUN apt-get update && apt-get install -y --no-install-recommends \
        python \
        python-pip \
        python-setuptools \
        python-pil \
        git \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No

COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

COPY . .
RUN pip install --no-cache-dir --no-binary :all: -e . && \
    pip install --no-cache-dir --no-binary :all: tzlocal

CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
docker/Dockerfile.lint (new file, 6 lines)
@@ -0,0 +1,6 @@
FROM python:2.7

COPY requirements.txt /requirements.txt

RUN pip install -r /requirements.txt && \
    pip install flake8==3.5.0 pylint==1.9.3 tzlocal pillow
docker/Dockerfile.test (new file, 19 lines)
@@ -0,0 +1,19 @@
FROM ubuntu:bionic

RUN apt-get update && apt-get install -y --no-install-recommends \
        python \
        python-pip \
        python-setuptools \
        python-pil \
        git \
    && apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* && \
    pip install --no-cache-dir --no-binary :all: platformio && \
    platformio settings set enable_telemetry No

COPY docker/platformio.ini /pio/platformio.ini
RUN platformio run -d /pio; rm -rf /pio

COPY requirements.txt /requirements.txt

RUN pip install --no-cache-dir -r /requirements.txt && \
    pip install --no-cache-dir tzlocal pillow
docker/hassio-builder.sh (new executable file, 318 lines)
@@ -0,0 +1,318 @@
#!/usr/bin/env bash
|
||||
# Based on Home Assistant's docker builder
|
||||
######################
|
||||
# Hass.io Build-env
|
||||
######################
|
||||
set -e
|
||||
|
||||
echo -- "$@"
|
||||
|
||||
#### Variable ####
|
||||
|
||||
DOCKER_TIMEOUT=20
|
||||
DOCKER_PID=-1
|
||||
DOCKER_HUB=""
|
||||
DOCKER_CACHE="true"
|
||||
DOCKER_LOCAL="false"
|
||||
TARGET=""
|
||||
IMAGE=""
|
||||
BUILD_LIST=()
|
||||
BUILD_TASKS=()
|
||||
|
||||
#### Misc functions ####
|
||||
|
||||
function print_help() {
|
||||
cat << EOF
|
||||
Hass.io build-env for ecosystem:
|
||||
docker run --rm homeassistant/{arch}-builder:latest [options]
|
||||
|
||||
Options:
|
||||
-h, --help
|
||||
Display this help and exit.
|
||||
|
||||
Repository / Data
|
||||
-t, --target <PATH_TO_BUILD>
|
||||
Set local folder or path inside repository for build.
|
||||
|
||||
Version/Image handling
|
||||
-i, --image <IMAGE_NAME>
|
||||
Overwrite image name of build / support {arch}
|
||||
|
||||
Architecture
|
||||
--armhf
|
||||
Build for arm.
|
||||
--amd64
|
||||
Build for intel/amd 64bit.
|
||||
--aarch64
|
||||
Build for arm 64bit.
|
||||
--i386
|
||||
Build for intel/amd 32bit.
|
||||
--all
|
||||
Build all architecture.
|
||||
|
||||
Build handling
|
||||
--no-cache
|
||||
Disable cache for the build (from latest).
|
||||
-d, --docker-hub <DOCKER_REPOSITORY>
|
||||
Set or overwrite the docker repository.
|
||||
|
||||
Use the host docker socket if mapped into container:
|
||||
/var/run/docker.sock
|
||||
|
||||
EOF
|
||||
|
||||
exit 1
|
||||
}
|
||||
|
||||
#### Docker functions ####
|
||||
|
||||
function start_docker() {
|
||||
local starttime
|
||||
local endtime
|
||||
|
||||
if [ -S "/var/run/docker.sock" ]; then
|
||||
echo "[INFO] Use host docker setup with '/var/run/docker.sock'"
|
||||
DOCKER_LOCAL="true"
|
||||
return 0
|
||||
fi
|
||||
|
||||
echo "[INFO] Starting docker."
|
||||
dockerd 2> /dev/null &
|
||||
DOCKER_PID=$!
|
||||
|
||||
echo "[INFO] Waiting for docker to initialize..."
|
||||
starttime="$(date +%s)"
|
||||
endtime="$(date +%s)"
|
||||
until docker info >/dev/null 2>&1; do
|
||||
if [ $((endtime - starttime)) -le ${DOCKER_TIMEOUT} ]; then
|
||||
sleep 1
|
||||
endtime=$(date +%s)
|
||||
else
|
||||
echo "[ERROR] Timeout while waiting for docker to come up"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
echo "[INFO] Docker was initialized"
|
||||
}
|
||||
|
||||
|
||||
function stop_docker() {
|
||||
local starttime
|
||||
local endtime
|
||||
|
||||
if [ "$DOCKER_LOCAL" == "true" ]; then
|
||||
return 0
|
||||
fi
|
||||
|
||||
echo "[INFO] Stopping in container docker..."
|
||||
if [ "$DOCKER_PID" -gt 0 ] && kill -0 "$DOCKER_PID" 2> /dev/null; then
|
||||
starttime="$(date +%s)"
|
||||
endtime="$(date +%s)"
|
||||
|
||||
# Now wait for it to die
|
||||
kill "$DOCKER_PID"
|
||||
while kill -0 "$DOCKER_PID" 2> /dev/null; do
|
||||
if [ $((endtime - starttime)) -le ${DOCKER_TIMEOUT} ]; then
|
||||
sleep 1
|
||||
endtime=$(date +%s)
|
||||
else
|
||||
echo "[ERROR] Timeout while waiting for container docker to die"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
else
|
||||
echo "[WARN] Your host might have been left with unreleased resources"
|
||||
fi
|
||||
}
|
||||
|
||||
function run_build() {
|
||||
local build_dir=$1
|
||||
local repository=$2
|
||||
local image=$3
|
||||
local version=$4
|
||||
local build_arch=$5
|
||||
local docker_cli=("${!6}")
|
||||
|
||||
local push_images=()
|
||||
|
||||
# Overwrites
|
||||
if [ ! -z "$DOCKER_HUB" ]; then repository="$DOCKER_HUB"; fi
|
||||
if [ ! -z "$IMAGE" ]; then image="$IMAGE"; fi
|
||||
|
||||
# Init Cache
|
||||
if [ "$DOCKER_CACHE" == "true" ]; then
|
||||
echo "[INFO] Init cache for $repository/$image:$version"
|
||||
if docker pull "$repository/$image:latest" > /dev/null 2>&1; then
|
||||
docker_cli+=("--cache-from" "$repository/$image:latest")
|
||||
else
|
||||
docker_cli+=("--no-cache")
|
||||
echo "[WARN] No cache image found. Cache is disabled for build"
|
||||
fi
|
||||
else
|
||||
docker_cli+=("--no-cache")
|
||||
fi
|
||||
|
||||
# Build image
|
||||
echo "[INFO] Run build for $repository/$image:$version"
|
||||
docker build --pull -t "$repository/$image:$version" \
|
||||
--label "io.hass.version=$version" \
|
||||
--label "io.hass.arch=$build_arch" \
|
||||
-f "$TARGET/docker/Dockerfile.$build_arch" \
|
||||
"${docker_cli[@]}" \
|
||||
"$build_dir"
|
||||
|
||||
echo "[INFO] Finish build for $repository/$image:$version"
|
||||
docker tag "$repository/$image:$version" "$repository/$image:dev"
|
||||
}
|
||||
|
||||
|
||||
#### HassIO functions ####
|
||||
|
||||
function build_addon() {
|
||||
local build_arch=$1
|
||||
|
||||
local docker_cli=()
|
||||
local image=""
|
||||
local repository=""
|
||||
local raw_image=""
|
||||
local name=""
|
||||
local description=""
|
||||
local url=""
|
||||
local args=""
|
||||
|
||||
# Read addon config.json
|
||||
name="$(jq --raw-output '.name // empty' "$TARGET/esphomeyaml/config.json" | sed "s/'//g")"
|
||||
description="$(jq --raw-output '.description // empty' "$TARGET/esphomeyaml/config.json" | sed "s/'//g")"
|
||||
url="$(jq --raw-output '.url // empty' "$TARGET/esphomeyaml/config.json")"
|
||||
version="$(jq --raw-output '.version' "$TARGET/esphomeyaml/config.json")"
|
||||
raw_image="$(jq --raw-output '.image // empty' "$TARGET/esphomeyaml/config.json")"
|
||||
|
||||
# Read data from image
|
||||
if [ ! -z "$raw_image" ]; then
|
||||
repository="$(echo "$raw_image" | cut -f 1 -d '/')"
|
||||
image="$(echo "$raw_image" | cut -f 2 -d '/')"
|
||||
fi
|
||||
|
||||
# Set additional labels
|
||||
docker_cli+=("--label" "io.hass.name=$name")
|
||||
docker_cli+=("--label" "io.hass.description=$description")
|
||||
docker_cli+=("--label" "io.hass.type=addon")
|
||||
|
||||
if [ ! -z "$url" ]; then
|
||||
docker_cli+=("--label" "io.hass.url=$url")
|
||||
fi
|
||||
|
||||
# Start build
|
||||
run_build "$TARGET" "$repository" "$image" "$version" \
|
||||
"$build_arch" docker_cli[@]
|
||||
}
|
||||
|
||||
#### initialized cross-build ####
|
||||
|
||||
function init_crosscompile() {
|
||||
echo "[INFO] Setup crosscompiling feature"
|
||||
(
|
||||
mount binfmt_misc -t binfmt_misc /proc/sys/fs/binfmt_misc
|
||||
update-binfmts --enable qemu-arm
|
||||
update-binfmts --enable qemu-aarch64
|
||||
) > /dev/null 2>&1 || echo "[WARN] Can't enable crosscompiling feature"
|
||||
}
|
||||
|
||||
|
||||
function clean_crosscompile() {
|
||||
echo "[INFO] Clean crosscompiling feature"
|
||||
if [ -f /proc/sys/fs/binfmt_misc ]; then
|
||||
umount /proc/sys/fs/binfmt_misc || true
|
||||
fi
|
||||
|
||||
(
|
||||
update-binfmts --disable qemu-arm
|
||||
update-binfmts --disable qemu-aarch64
|
||||
) > /dev/null 2>&1 || echo "[WARN] No crosscompiling feature found for cleanup"
|
||||
}
|
||||
|
||||
#### Error handling ####
|
||||
|
||||
function error_handling() {
|
||||
stop_docker
|
||||
clean_crosscompile
|
||||
|
||||
exit 1
|
||||
}
|
||||
trap 'error_handling' SIGINT SIGTERM
|
||||
|
||||
#### Parse arguments ####
|
||||
|
||||
while [[ $# -gt 0 ]]; do
|
||||
key=$1
|
||||
case ${key} in
|
||||
-h|--help)
|
||||
print_help
|
||||
;;
|
||||
-t|--target)
|
||||
TARGET=$2
|
||||
shift
|
||||
;;
|
||||
-i|--image)
|
||||
IMAGE=$2
|
||||
shift
|
||||
;;
|
||||
--no-cache)
|
||||
DOCKER_CACHE="false"
|
||||
;;
|
||||
-d|--docker-hub)
|
||||
DOCKER_HUB=$2
|
||||
shift
|
||||
;;
|
||||
--armhf)
|
||||
BUILD_LIST+=("armhf")
|
||||
;;
|
||||
--amd64)
|
||||
BUILD_LIST+=("amd64")
|
||||
;;
|
||||
--i386)
|
||||
BUILD_LIST+=("i386")
|
||||
;;
|
||||
--aarch64)
|
||||
BUILD_LIST+=("aarch64")
|
||||
;;
|
||||
--all)
|
||||
BUILD_LIST=("armhf" "amd64" "i386" "aarch64")
|
||||
;;
|
||||
|
||||
*)
|
||||
echo "[WARN] $0 : Argument '$1' unknown will be Ignoring"
|
||||
;;
|
||||
esac
|
||||
shift
|
||||
done
|
||||
|
||||
# Check if an architecture is available
|
||||
if [ "${#BUILD_LIST[@]}" -eq 0 ]; then
|
||||
echo "[ERROR] You need select an architecture for build!"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
|
||||
#### Main ####
|
||||
|
||||
mkdir -p /data
|
||||
|
||||
# Setup docker env
|
||||
init_crosscompile
|
||||
start_docker
|
||||
|
||||
# Select arch build
|
||||
for arch in "${BUILD_LIST[@]}"; do
|
||||
(build_addon "$arch") &
|
||||
BUILD_TASKS+=($!)
|
||||
done
|
||||
|
||||
# Wait until all build jobs are done
|
||||
wait "${BUILD_TASKS[@]}"
|
||||
|
||||
# Cleanup docker env
|
||||
clean_crosscompile
|
||||
stop_docker
|
||||
|
||||
exit 0
|
||||
docker/platformio-esp8266.ini (new file, 7 lines)
@@ -0,0 +1,7 @@
; This file allows the docker build file to install the required platformio
; platforms

[env:espressif8266]
platform = espressif8266
board = nodemcuv2
framework = arduino
@@ -1,12 +1,12 @@
; This file allows the docker build file to install the required platformio
; platforms

[env:espressif32]
platform = espressif32
board = nodemcu-32s
framework = arduino

[env:espressif8266]
platform = espressif8266
board = nodemcuv2
framework = arduino

[env:espressif32]
platform = espressif32
board = nodemcu-32s
framework = arduino
esphomeyaml-beta/config.json (new file, 33 lines)
@@ -0,0 +1,33 @@
{
  "name": "esphomeyaml-beta",
  "version": "1.9.0b1",
  "slug": "esphomeyaml-beta",
  "description": "Beta version of esphomeyaml HassIO add-on.",
  "url": "https://esphomelib.com/esphomeyaml/index.html",
  "startup": "application",
  "webui": "http://[HOST]:[PORT:6052]",
  "boot": "auto",
  "ports": {
    "6052/tcp": 6052,
    "6053/tcp": 6053
  },
  "auto_uart": true,
  "map": [
    "config:rw"
  ],
  "arch": [
    "amd64",
    "armhf",
    "i386"
  ],
  "environment": {
    "ESPHOMEYAML_OTA_HOST_PORT": "6053"
  },
  "options": {
    "password": ""
  },
  "schema": {
    "password": "str?"
  },
  "image": "ottowinter/esphomeyaml-hassio-{arch}"
}
@@ -16,10 +16,25 @@ ARG BUILD_FROM
# * disable platformio telemetry on install
RUN /bin/bash -c "if [[ '$BUILD_FROM' = *\"ubuntu\"* ]]; then \
        apt-get update && apt-get install -y --no-install-recommends \
            python python-pip python-setuptools git && \
            python python-pip python-setuptools python-pil git && \
            rm -rf /var/lib/apt/lists/* /tmp/*; \
    else \
        apk add --no-cache python2 py2-pip git openssh libc6-compat; \
        apk add --no-cache \
            python2 \
            python2-dev \
            py2-pip \
            git \
            gcc \
            openssh \
            libc6-compat \
            jpeg-dev \
            zlib-dev \
            freetype-dev \
            lcms2-dev \
            openjpeg-dev \
            tiff-dev \
            libc-dev \
            linux-headers; \
    fi" && \
    pip install --no-cache-dir platformio && \
    platformio settings set enable_telemetry No
@@ -38,6 +53,7 @@ RUN /bin/bash -c "if [[ '$BUILD_FROM' = *\"ubuntu\"* ]]; then \

# Install latest esphomeyaml from git
RUN pip install --no-cache-dir \
        git+git://github.com/OttoWinter/esphomeyaml.git
        git+git://github.com/OttoWinter/esphomeyaml.git && \
    pip install --no-cache-dir pillow tzlocal

CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
@@ -21,9 +21,13 @@
  "map": [
    "config:rw"
  ],
  "options": {},
  "options": {
    "password": ""
  },
  "schema": {
    "password": "str?"
  },
  "environment": {
    "ESPHOMEYAML_OTA_HOST_PORT": "6053"
  },
  "schema": {}
}
}
@@ -1,44 +0,0 @@
# Dockerfile for HassIO add-on
ARG BUILD_FROM=ubuntu:bionic
FROM ${BUILD_FROM}

# Re-declare BUILD_FROM to fix weird docker issue
ARG BUILD_FROM
ARG BUILD_VERSION

# On amd64 and alike, using ubuntu as the base is better as building
# for the ESP32 only works with glibc (and ubuntu). However, on armhf
# the build toolchain frequently produces segfaults under ubuntu.
# -> Use ubuntu for most architectures, except alpine for armhf
#
# * python and related required because this is a python project
# * git required for platformio library dependencies downloads
# * libc6-compat and openssh required on alpine for weird reasons
# * disable platformio telemetry on install
RUN /bin/bash -c "if [[ '$BUILD_FROM' = *\"ubuntu\"* ]]; then \
        apt-get update && apt-get install -y --no-install-recommends \
            python python-pip python-setuptools git && \
            rm -rf /var/lib/apt/lists/* /tmp/*; \
    else \
        apk add --no-cache python2 py2-pip git openssh libc6-compat; \
    fi" && \
    pip install --no-cache-dir platformio && \
    platformio settings set enable_telemetry No


# Create fake project to make platformio install all dependencies.
# * Ignore build errors from platformio - empty project
# * On alpine, only install ESP8266 toolchain
COPY platformio.ini /pio/platformio.ini
RUN /bin/bash -c "if [[ '$BUILD_FROM' = *\"ubuntu\"* ]]; then \
        platformio run -e espressif32 -e espressif8266 -d /pio; exit 0; \
    else \
        echo \"\$(head -8 /pio/platformio.ini)\" >/pio/platformio.ini; \
        platformio run -e espressif8266 -d /pio; exit 0; \
    fi"

# Install latest esphomeyaml from git
RUN pip install --no-cache-dir \
        esphomeyaml==${BUILD_VERSION}

CMD ["esphomeyaml", "/config/esphomeyaml", "dashboard"]
@@ -7,27 +7,22 @@ import random
|
||||
import sys
|
||||
from datetime import datetime
|
||||
|
||||
from esphomeyaml import const, core, mqtt, wizard, writer, yaml_util
|
||||
from esphomeyaml.config import core_to_code, get_component, iter_components, read_config
|
||||
from esphomeyaml.const import CONF_BAUD_RATE, CONF_DOMAIN, CONF_ESPHOMEYAML, CONF_HOSTNAME, \
|
||||
CONF_LOGGER, CONF_MANUAL_IP, CONF_NAME, CONF_STATIC_IP, CONF_WIFI, ESP_PLATFORM_ESP8266
|
||||
from esphomeyaml import const, core, core_config, mqtt, wizard, writer, yaml_util
|
||||
from esphomeyaml.config import get_component, iter_components, read_config
|
||||
from esphomeyaml.const import CONF_BAUD_RATE, CONF_BUILD_PATH, CONF_DOMAIN, CONF_ESPHOMEYAML, \
|
||||
CONF_HOSTNAME, CONF_LOGGER, CONF_MANUAL_IP, CONF_NAME, CONF_STATIC_IP, CONF_USE_CUSTOM_CODE, \
|
||||
CONF_WIFI, ESP_PLATFORM_ESP8266
|
||||
from esphomeyaml.core import ESPHomeYAMLError
|
||||
from esphomeyaml.helpers import AssignmentExpression, Expression, RawStatement, _EXPRESSIONS, add, \
|
||||
add_job, color, flush_tasks, indent, quote, statement
|
||||
from esphomeyaml.helpers import AssignmentExpression, Expression, RawStatement, \
|
||||
_EXPRESSIONS, add, \
|
||||
add_job, color, flush_tasks, indent, quote, statement, relative_path
|
||||
from esphomeyaml.util import safe_print
|
||||
|
||||
_LOGGER = logging.getLogger(__name__)
|
||||
|
||||
PRE_INITIALIZE = ['esphomeyaml', 'logger', 'wifi', 'ota', 'mqtt', 'web_server', 'i2c']
|
||||
|
||||
|
||||
def get_name(config):
|
||||
return config[CONF_ESPHOMEYAML][CONF_NAME]
|
||||
|
||||
|
||||
def get_base_path(config):
|
||||
return os.path.join(os.path.dirname(core.CONFIG_PATH), get_name(config))
|
||||
|
||||
|
||||
def get_serial_ports():
|
||||
# from https://github.com/pyserial/pyserial/blob/master/serial/tools/list_ports.py
|
||||
from serial.tools.list_ports import comports
|
||||
@@ -45,11 +40,11 @@ def choose_serial_port(config):
|
||||
|
||||
if not result:
|
||||
return 'OTA'
|
||||
print(u"Found multiple serial port options, please choose one:")
|
||||
safe_print(u"Found multiple serial port options, please choose one:")
|
||||
for i, (res, desc) in enumerate(result):
|
||||
print(u" [{}] {} ({})".format(i, res, desc))
|
||||
print(u" [{}] Over The Air ({})".format(len(result), get_upload_host(config)))
|
||||
print()
|
||||
safe_print(u" [{}] {} ({})".format(i, res, desc))
|
||||
safe_print(u" [{}] Over The Air ({})".format(len(result), get_upload_host(config)))
|
||||
safe_print()
|
||||
while True:
|
||||
opt = raw_input('(number): ')
|
||||
if opt in result:
|
||||
@@ -61,7 +56,7 @@ def choose_serial_port(config):
|
||||
raise ValueError
|
||||
break
|
||||
except ValueError:
|
||||
print(color('red', u"Invalid option: '{}'".format(opt)))
|
||||
safe_print(color('red', u"Invalid option: '{}'".format(opt)))
|
||||
if opt == len(result):
|
||||
return 'OTA'
|
||||
return result[opt][0]
|
||||
@@ -97,23 +92,33 @@ def run_platformio(*cmd, **kwargs):
|
||||
|
||||
def run_miniterm(config, port, escape=False):
|
||||
import serial
|
||||
baud_rate = config.get(CONF_LOGGER, {}).get(CONF_BAUD_RATE, 115200)
|
||||
if CONF_LOGGER not in config:
|
||||
_LOGGER.info("Logger is not enabled. Not starting UART logs.")
|
||||
return
|
||||
baud_rate = config['logger'][CONF_BAUD_RATE]
|
||||
if baud_rate == 0:
|
||||
_LOGGER.info("UART logging is disabled (baud_rate=0). Not starting UART logs.")
|
||||
_LOGGER.info("Starting log output from %s with baud rate %s", port, baud_rate)
|
||||
|
||||
with serial.Serial(port, baudrate=baud_rate) as ser:
|
||||
while True:
|
||||
line = ser.readline()
|
||||
try:
|
||||
raw = ser.readline()
|
||||
except serial.SerialException:
|
||||
_LOGGER.error("Serial port closed!")
|
||||
return
|
||||
line = raw.replace('\r', '').replace('\n', '')
|
||||
time = datetime.now().time().strftime('[%H:%M:%S]')
|
||||
message = time + line.decode('unicode-escape').replace('\r', '').replace('\n', '')
|
||||
message = time + line
|
||||
if escape:
|
||||
message = message.replace('\033', '\\033').encode('ascii', 'replace')
|
||||
print(message)
|
||||
message = message.replace('\033', '\\033')
|
||||
safe_print(message)
|
||||
|
||||
|
||||
def write_cpp(config):
|
||||
_LOGGER.info("Generating C++ source...")
|
||||
|
||||
add_job(core_to_code, config[CONF_ESPHOMEYAML], domain='esphomeyaml')
|
||||
add_job(core_config.to_code, config[CONF_ESPHOMEYAML], domain='esphomeyaml')
|
||||
for domain in PRE_INITIALIZE:
|
||||
if domain == CONF_ESPHOMEYAML or domain not in config:
|
||||
continue
|
||||
@@ -129,7 +134,7 @@ def write_cpp(config):
|
||||
add(RawStatement(''))
|
||||
all_code = []
|
||||
for exp in _EXPRESSIONS:
|
||||
if core.SIMPLIFY:
|
||||
if not config[CONF_ESPHOMEYAML][CONF_USE_CUSTOM_CODE]:
|
||||
if isinstance(exp, Expression) and not exp.required:
|
||||
continue
|
||||
if isinstance(exp, AssignmentExpression) and not exp.obj.required:
|
||||
@@ -138,19 +143,19 @@ def write_cpp(config):
|
||||
exp = exp.rhs
|
||||
all_code.append(unicode(statement(exp)))
|
||||
|
||||
platformio_ini_s = writer.get_ini_content(config)
|
||||
ini_path = os.path.join(get_base_path(config), 'platformio.ini')
|
||||
writer.write_platformio_ini(platformio_ini_s, ini_path)
|
||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
||||
writer.write_platformio_project(config, build_path)
|
||||
|
||||
code_s = indent('\n'.join(line.rstrip() for line in all_code))
|
||||
cpp_path = os.path.join(get_base_path(config), 'src', 'main.cpp')
|
||||
cpp_path = os.path.join(build_path, 'src', 'main.cpp')
|
||||
writer.write_cpp(code_s, cpp_path)
|
||||
return 0
|
||||
|
||||
|
||||
def compile_program(args, config):
|
||||
_LOGGER.info("Compiling app...")
|
||||
command = ['platformio', 'run', '-d', get_base_path(config)]
|
||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
||||
command = ['platformio', 'run', '-d', build_path]
|
||||
if args.verbose:
|
||||
command.append('-v')
|
||||
return run_platformio(*command)
|
||||
@@ -169,8 +174,8 @@ def get_upload_host(config):
|
||||
def upload_using_esptool(config, port):
|
||||
import esptool
|
||||
|
||||
name = get_name(config)
|
||||
path = os.path.join(get_base_path(config), '.pioenvs', name, 'firmware.bin')
|
||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
||||
path = os.path.join(build_path, '.pioenvs', core.NAME, 'firmware.bin')
|
||||
# pylint: disable=protected-access
|
||||
return run_platformio('esptool.py', '--before', 'default_reset', '--after', 'hard_reset',
|
||||
'--chip', 'esp8266', '--port', port, 'write_flash', '0x0',
|
||||
@@ -178,11 +183,14 @@ def upload_using_esptool(config, port):
|
||||
|
||||
|
||||
def upload_program(config, args, port):
|
||||
_LOGGER.info("Uploading binary...")
|
||||
if port != 'OTA':
|
||||
build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
|
||||
|
||||
# if upload is to a serial port use platformio, otherwise assume ota
|
||||
serial_port = port.startswith('/') or port.startswith('COM')
|
||||
if port != 'OTA' and serial_port:
|
||||
if core.ESP_PLATFORM == ESP_PLATFORM_ESP8266 and args.use_esptoolpy:
|
||||
return upload_using_esptool(config, port)
|
||||
command = ['platformio', 'run', '-d', get_base_path(config),
|
||||
command = ['platformio', 'run', '-d', build_path,
|
||||
'-t', 'upload', '--upload-port', port]
|
||||
if args.verbose:
|
||||
command.append('-v')
|
||||
@@ -192,24 +200,36 @@ def upload_program(config, args, port):
|
||||
_LOGGER.error("No serial port found and OTA not enabled. Can't upload!")
|
||||
return -1
|
||||
|
||||
host = get_upload_host(config)
|
||||
# If hostname/ip is explicitly provided as upload-port argument, use this instead of zeroconf
|
||||
# hostname. This is to support use cases where zeroconf (hostname.local) does not work.
|
||||
if port != 'OTA':
|
||||
host = port
|
||||
else:
|
||||
host = get_upload_host(config)
|
||||
|
||||
from esphomeyaml.components import ota
|
||||
from esphomeyaml import espota
|
||||
from esphomeyaml import espota2
|
||||
|
||||
bin_file = os.path.join(get_base_path(config), '.pioenvs', get_name(config), 'firmware.bin')
|
||||
bin_file = os.path.join(build_path, '.pioenvs', core.NAME, 'firmware.bin')
|
||||
if args.host_port is not None:
|
||||
host_port = args.host_port
|
||||
else:
|
||||
host_port = int(os.getenv('ESPHOMEYAML_OTA_HOST_PORT', random.randint(10000, 60000)))
|
||||
espota_args = ['espota.py', '--debug', '--progress', '-i', host,
|
||||
'-p', str(ota.get_port(config)), '-f', bin_file,
|
||||
'-a', ota.get_auth(config), '-P', str(host_port)]
|
||||
return espota.main(espota_args)
|
||||
|
||||
verbose = args.verbose
|
||||
remote_port = ota.get_port(config)
|
||||
password = ota.get_auth(config)
|
||||
|
||||
res = espota2.run_ota(host, remote_port, password, bin_file)
|
||||
if res == 0:
|
||||
return res
|
||||
_LOGGER.warn("OTA v2 method failed. Trying with legacy OTA...")
|
||||
return espota2.run_legacy_ota(verbose, host_port, host, remote_port, password, bin_file)
|
||||
|
||||
|
||||
def show_logs(config, args, port, escape=False):
|
||||
if port != 'OTA':
|
||||
serial_port = port.startswith('/') or port.startswith('COM')
|
||||
if port != 'OTA' and serial_port:
|
||||
run_miniterm(config, port, escape=escape)
|
||||
return 0
|
||||
return mqtt.show_logs(config, args.topic, args.username, args.password, args.client_id,
|
||||
@@ -273,7 +293,7 @@ def strip_default_ids(config):
 def command_config(args, config):
     if not args.verbose:
         config = strip_default_ids(config)
-    print(yaml_util.dump(config))
+    safe_print(yaml_util.dump(config))
     return 0


@@ -281,6 +301,9 @@ def command_compile(args, config):
     exit_code = write_cpp(config)
     if exit_code != 0:
         return exit_code
+    if args.only_generate:
+        _LOGGER.info(u"Successfully generated source code.")
+        return 0
     exit_code = compile_program(args, config)
     if exit_code != 0:
         return exit_code
@@ -329,7 +352,18 @@ def command_mqtt_fingerprint(args, config):


 def command_version(args):
-    print(u"Version: {}".format(const.__version__))
+    safe_print(u"Version: {}".format(const.__version__))
     return 0


+def command_clean(args, config):
+    build_path = relative_path(config[CONF_ESPHOMEYAML][CONF_BUILD_PATH])
+    try:
+        writer.clean_build(build_path)
+    except OSError as err:
+        _LOGGER.error("Error deleting build files: %s", err)
+        return 1
+    _LOGGER.info("Done!")
+    return 0
+
+
@@ -353,6 +387,7 @@ POST_CONFIG_ACTIONS = {
     'run': command_run,
     'clean-mqtt': command_clean_mqtt,
     'mqtt-fingerprint': command_mqtt_fingerprint,
+    'clean': command_clean,
 }


@@ -366,7 +401,11 @@ def parse_args(argv):
     subparsers.required = True
     subparsers.add_parser('config', help='Validate the configuration and spit it out.')

-    subparsers.add_parser('compile', help='Read the configuration and compile a program.')
+    parser_compile = subparsers.add_parser('compile',
+                                           help='Read the configuration and compile a program.')
+    parser_compile.add_argument('--only-generate',
+                                help="Only generate source code, do not compile.",
+                                action='store_true')

     parser_upload = subparsers.add_parser('upload', help='Validate the configuration '
                                                          'and upload the latest binary.')
@@ -390,7 +429,7 @@ def parse_args(argv):

     parser_run = subparsers.add_parser('run', help='Validate the configuration, create a binary, '
                                                    'upload it, and start MQTT logs.')
-    parser_run.add_argument('--upload-port', help="Manually specify the upload port to use. "
+    parser_run.add_argument('--upload-port', help="Manually specify the upload port/ip to use. "
                                                   "For example /dev/cu.SLAB_USBtoUART.")
     parser_run.add_argument('--host-port', help="Specify the host port to use for OTA", type=int)
     parser_run.add_argument('--no-logs', help='Disable starting MQTT logs.',
@@ -419,10 +458,16 @@ def parse_args(argv):

     subparsers.add_parser('version', help="Print the esphomeyaml version and exit.")

+    subparsers.add_parser('clean', help="Delete all temporary build files.")
+
     dashboard = subparsers.add_parser('dashboard',
-                                      help="Create a simple webserver for a dashboard.")
+                                      help="Create a simple web server for a dashboard.")
     dashboard.add_argument("--port", help="The HTTP port to open connections on.", type=int,
                            default=6052)
+    dashboard.add_argument("--password", help="The optional password to require for all requests.",
+                           type=str, default='')
+    dashboard.add_argument("--open-ui", help="Open the dashboard UI in a browser.",
+                           action='store_true')

     return parser.parse_args(argv[1:])

@@ -449,7 +494,7 @@ def run_esphomeyaml(argv):
     except ESPHomeYAMLError as e:
         _LOGGER.error(e)
         return 1
-    print(u"Unknown command {}".format(args.command))
+    safe_print(u"Unknown command {}".format(args.command))
     return 1

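As a rough usage sketch of the new CLI flags above (illustrative only; it assumes parse_args also defines a positional configuration argument outside these hunks, and the file name is made up):

# Sketch: exercising the new 'compile --only-generate' and 'dashboard' options.
args = parse_args(['esphomeyaml', 'livingroom.yaml', 'compile', '--only-generate'])
assert args.only_generate is True

args = parse_args(['esphomeyaml', 'livingroom.yaml', 'dashboard',
                   '--port', '6052', '--password', 'hunter2', '--open-ui'])
assert args.port == 6052 and args.open_ui is True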
@@ -1,121 +1,69 @@
|
||||
import copy
|
||||
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml import core
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import cover, fan
|
||||
from esphomeyaml.const import CONF_ABOVE, CONF_ACTION_ID, CONF_AND, CONF_AUTOMATION_ID, \
|
||||
CONF_BELOW, \
|
||||
CONF_BLUE, CONF_BRIGHTNESS, CONF_CONDITION_ID, CONF_DELAY, CONF_EFFECT, CONF_FLASH_LENGTH, \
|
||||
CONF_GREEN, CONF_ID, CONF_IF, CONF_LAMBDA, CONF_OR, CONF_OSCILLATING, CONF_PAYLOAD, CONF_QOS, \
|
||||
CONF_RANGE, CONF_RED, CONF_RETAIN, CONF_SPEED, CONF_THEN, CONF_TOPIC, CONF_TRANSITION_LENGTH, \
|
||||
CONF_TRIGGER_ID, CONF_WHITE
|
||||
CONF_BELOW, CONF_CONDITION, CONF_CONDITION_ID, CONF_DELAY, \
|
||||
CONF_ELSE, CONF_ID, CONF_IF, CONF_LAMBDA, \
|
||||
CONF_OR, CONF_RANGE, CONF_THEN, CONF_TRIGGER_ID
|
||||
from esphomeyaml.core import ESPHomeYAMLError
|
||||
from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, TemplateArguments, add, \
|
||||
bool_, esphomelib_ns, float_, get_variable, process_lambda, std_string, templatable, uint32, \
|
||||
uint8, add_job
|
||||
from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, TemplateArguments, add, add_job, \
|
||||
esphomelib_ns, float_, process_lambda, templatable, uint32, get_variable
|
||||
from esphomeyaml.util import ServiceRegistry
|
||||
|
||||
CONF_MQTT_PUBLISH = 'mqtt.publish'
|
||||
CONF_LIGHT_TOGGLE = 'light.toggle'
|
||||
CONF_LIGHT_TURN_OFF = 'light.turn_off'
|
||||
CONF_LIGHT_TURN_ON = 'light.turn_on'
|
||||
CONF_SWITCH_TOGGLE = 'switch.toggle'
|
||||
CONF_SWITCH_TURN_OFF = 'switch.turn_off'
|
||||
CONF_SWITCH_TURN_ON = 'switch.turn_on'
|
||||
CONF_COVER_OPEN = 'cover.open'
|
||||
CONF_COVER_CLOSE = 'cover.close'
|
||||
CONF_COVER_STOP = 'cover.stop'
|
||||
CONF_FAN_TOGGLE = 'fan.toggle'
|
||||
CONF_FAN_TURN_OFF = 'fan.turn_off'
|
||||
CONF_FAN_TURN_ON = 'fan.turn_on'
|
||||
|
||||
ACTION_KEYS = [CONF_DELAY, CONF_MQTT_PUBLISH, CONF_LIGHT_TOGGLE, CONF_LIGHT_TURN_OFF,
|
||||
CONF_LIGHT_TURN_ON, CONF_SWITCH_TOGGLE, CONF_SWITCH_TURN_OFF, CONF_SWITCH_TURN_ON,
|
||||
CONF_LAMBDA, CONF_COVER_OPEN, CONF_COVER_CLOSE, CONF_COVER_STOP, CONF_FAN_TOGGLE,
|
||||
CONF_FAN_TURN_OFF, CONF_FAN_TURN_ON]
|
||||
def maybe_simple_id(*validators):
|
||||
validator = vol.All(*validators)
|
||||
|
||||
ACTIONS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
|
||||
cv.GenerateID(CONF_ACTION_ID): cv.declare_variable_id(None),
|
||||
vol.Optional(CONF_DELAY): cv.templatable(cv.positive_time_period_milliseconds),
|
||||
vol.Optional(CONF_MQTT_PUBLISH): vol.Schema({
|
||||
vol.Required(CONF_TOPIC): cv.templatable(cv.publish_topic),
|
||||
vol.Required(CONF_PAYLOAD): cv.templatable(cv.mqtt_payload),
|
||||
vol.Optional(CONF_QOS): cv.templatable(cv.mqtt_qos),
|
||||
vol.Optional(CONF_RETAIN): cv.templatable(cv.boolean),
|
||||
}),
|
||||
vol.Optional(CONF_LIGHT_TOGGLE): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
|
||||
}),
|
||||
vol.Optional(CONF_LIGHT_TURN_OFF): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
|
||||
}),
|
||||
vol.Optional(CONF_LIGHT_TURN_ON): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
vol.Exclusive(CONF_TRANSITION_LENGTH, 'transformer'):
|
||||
cv.templatable(cv.positive_time_period_milliseconds),
|
||||
vol.Exclusive(CONF_FLASH_LENGTH, 'transformer'):
|
||||
cv.templatable(cv.positive_time_period_milliseconds),
|
||||
vol.Optional(CONF_BRIGHTNESS): cv.templatable(cv.percentage),
|
||||
vol.Optional(CONF_RED): cv.templatable(cv.percentage),
|
||||
vol.Optional(CONF_GREEN): cv.templatable(cv.percentage),
|
||||
vol.Optional(CONF_BLUE): cv.templatable(cv.percentage),
|
||||
vol.Optional(CONF_WHITE): cv.templatable(cv.percentage),
|
||||
vol.Optional(CONF_EFFECT): cv.templatable(cv.string),
|
||||
}),
|
||||
vol.Optional(CONF_SWITCH_TOGGLE): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_SWITCH_TURN_OFF): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_SWITCH_TURN_ON): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_COVER_OPEN): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_COVER_CLOSE): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_COVER_STOP): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_COVER_OPEN): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_COVER_CLOSE): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_COVER_STOP): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_FAN_TOGGLE): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_FAN_TURN_OFF): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
}),
|
||||
vol.Optional(CONF_FAN_TURN_ON): vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_OSCILLATING): cv.templatable(cv.boolean),
|
||||
vol.Optional(CONF_SPEED): cv.templatable(fan.validate_fan_speed),
|
||||
}),
|
||||
vol.Optional(CONF_LAMBDA): cv.lambda_,
|
||||
}, cv.has_exactly_one_key(*ACTION_KEYS))])
|
||||
def validate(value):
|
||||
if isinstance(value, dict):
|
||||
return validator(value)
|
||||
return validator({CONF_ID: value})
|
||||
|
||||
# pylint: disable=invalid-name
|
||||
DelayAction = esphomelib_ns.DelayAction
|
||||
LambdaAction = esphomelib_ns.LambdaAction
|
||||
Automation = esphomelib_ns.Automation
|
||||
return validate
|
||||
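A quick usage sketch of maybe_simple_id (illustrative; the schema and the 'garage_door' ID are invented): it lets an action accept either a bare ID or a full mapping, and both spellings normalize to the same mapping shape.

# Sketch only -- uses the names already imported in this module.
SIMPLE_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(None),
})
short_form = SIMPLE_SCHEMA('garage_door')            # e.g. "cover.open: garage_door"
long_form = SIMPLE_SCHEMA({CONF_ID: 'garage_door'})  # e.g. "cover.open: {id: garage_door}"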
|
||||
|
||||
def validate_recursive_condition(value):
|
||||
return CONDITIONS_SCHEMA(value)
|
||||
|
||||
|
||||
CONDITION_KEYS = [CONF_AND, CONF_OR, CONF_RANGE, CONF_LAMBDA]
|
||||
def validate_recursive_action(value):
|
||||
value = cv.ensure_list(value)[:]
|
||||
for i, item in enumerate(value):
|
||||
item = copy.deepcopy(item)
|
||||
if not isinstance(item, dict):
|
||||
raise vol.Invalid(u"Action must consist of key-value mapping! Got {}".format(item))
|
||||
key = next((x for x in item if x != CONF_ACTION_ID), None)
|
||||
if key is None:
|
||||
raise vol.Invalid(u"Key missing from action! Got {}".format(item))
|
||||
if key not in ACTION_REGISTRY:
|
||||
raise vol.Invalid(u"Unable to find action with the name '{}', is the component loaded?"
|
||||
u"".format(key))
|
||||
item.setdefault(CONF_ACTION_ID, None)
|
||||
key2 = next((x for x in item if x != CONF_ACTION_ID and x != key), None)
|
||||
if key2 is not None:
|
||||
raise vol.Invalid(u"Cannot have two actions in one item. Key {} overrides {}!"
|
||||
u"".format(key, key2))
|
||||
validator = ACTION_REGISTRY[key][0]
|
||||
value[i] = {
|
||||
CONF_ACTION_ID: cv.declare_variable_id(None)(item[CONF_ACTION_ID]),
|
||||
key: validator(item[key])
|
||||
}
|
||||
return value
|
||||
|
||||
CONDITIONS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
|
||||
|
||||
ACTION_REGISTRY = ServiceRegistry()
|
||||
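A hedged sketch of how a component module is expected to plug into this registry ('my_component.beep' and make_beep_action are invented for the example; the register(name, schema) decorator and the (config, action_id, arg_type) generator signature mirror the delay/lambda/cover registrations elsewhere in this changeset):

CONF_MY_BEEP = 'my_component.beep'  # hypothetical action key
MY_BEEP_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(None),
})


@ACTION_REGISTRY.register(CONF_MY_BEEP, MY_BEEP_ACTION_SCHEMA)
def my_beep_action_to_code(config, action_id, arg_type):
    template_arg = TemplateArguments(arg_type)
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_beep_action(template_arg)  # hypothetical esphomelib call
    yield Pvariable(action_id, rhs)

With that in place, validate_recursive_action accepts a "my_component.beep:" entry because the key is found in ACTION_REGISTRY, and build_action dispatches to the registered generator.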
|
||||
# pylint: disable=invalid-name
|
||||
DelayAction = esphomelib_ns.DelayAction
|
||||
LambdaAction = esphomelib_ns.LambdaAction
|
||||
IfAction = esphomelib_ns.IfAction
|
||||
UpdateComponentAction = esphomelib_ns.UpdateComponentAction
|
||||
Automation = esphomelib_ns.Automation
|
||||
|
||||
CONDITIONS_SCHEMA = vol.All(cv.ensure_list, [cv.templatable({
|
||||
cv.GenerateID(CONF_CONDITION_ID): cv.declare_variable_id(None),
|
||||
vol.Optional(CONF_AND): validate_recursive_condition,
|
||||
vol.Optional(CONF_OR): validate_recursive_condition,
|
||||
@@ -124,7 +72,7 @@ CONDITIONS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
|
||||
vol.Optional(CONF_BELOW): vol.Coerce(float),
|
||||
}), cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW)),
|
||||
vol.Optional(CONF_LAMBDA): cv.lambda_,
|
||||
}), cv.has_exactly_one_key(*CONDITION_KEYS)])
|
||||
})])
|
||||
|
||||
# pylint: disable=invalid-name
|
||||
AndCondition = esphomelib_ns.AndCondition
|
||||
@@ -132,17 +80,57 @@ OrCondition = esphomelib_ns.OrCondition
|
||||
RangeCondition = esphomelib_ns.RangeCondition
|
||||
LambdaCondition = esphomelib_ns.LambdaCondition
|
||||
|
||||
|
||||
def validate_automation(extra_schema=None, extra_validators=None, single=False):
|
||||
schema = AUTOMATION_SCHEMA.extend(extra_schema or {})
|
||||
|
||||
def validator_(value):
|
||||
if isinstance(value, list):
|
||||
try:
|
||||
# First try as a sequence of actions
|
||||
return [schema({CONF_THEN: value})]
|
||||
except vol.Invalid as err:
|
||||
# Next try as a sequence of automations
|
||||
try:
|
||||
return vol.Schema([schema])(value)
|
||||
except vol.Invalid as err2:
|
||||
raise vol.MultipleInvalid([err, err2])
|
||||
elif isinstance(value, dict):
|
||||
if CONF_THEN in value:
|
||||
return [schema(value)]
|
||||
return [schema({CONF_THEN: value})]
|
||||
# This should only happen with invalid configs, but let's have a nice error message.
|
||||
return [schema(value)]
|
||||
|
||||
def validator(value):
|
||||
value = validator_(value)
|
||||
if extra_validators is not None:
|
||||
value = vol.Schema([extra_validators])(value)
|
||||
if single:
|
||||
if len(value) != 1:
|
||||
raise vol.Invalid("Cannot have more than 1 automation for templates")
|
||||
return value[0]
|
||||
return value
|
||||
|
||||
return validator
|
||||
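A sketch of the input shapes this validator is meant to normalize (illustrative; it assumes the switch component has been loaded so that 'switch.toggle' is present in ACTION_REGISTRY):

# All three spellings should come back as a list with one automation dict whose
# CONF_THEN holds the same single action; trigger/automation IDs are auto-generated.
validator = validate_automation()
as_action_list = validator([{'switch.toggle': 'my_switch'}])
as_single_action = validator({'switch.toggle': 'my_switch'})
as_full_automation = validator({'then': [{'switch.toggle': 'my_switch'}]})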
|
||||
|
||||
AUTOMATION_SCHEMA = vol.Schema({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(None),
|
||||
cv.GenerateID(CONF_AUTOMATION_ID): cv.declare_variable_id(None),
|
||||
vol.Optional(CONF_IF): CONDITIONS_SCHEMA,
|
||||
vol.Required(CONF_THEN): ACTIONS_SCHEMA,
|
||||
vol.Required(CONF_THEN): validate_recursive_action,
|
||||
})
|
||||
|
||||
|
||||
def build_condition(config, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
if CONF_AND in config:
|
||||
if isinstance(config, core.Lambda):
|
||||
lambda_ = None
|
||||
for lambda_ in process_lambda(config, [(arg_type, 'x')]):
|
||||
yield
|
||||
yield LambdaCondition.new(template_arg, lambda_)
|
||||
elif CONF_AND in config:
|
||||
yield AndCondition.new(template_arg, build_conditions(config[CONF_AND], template_arg))
|
||||
elif CONF_OR in config:
|
||||
yield OrCondition.new(template_arg, build_conditions(config[CONF_OR], template_arg))
|
||||
@@ -181,203 +169,84 @@ def build_conditions(config, arg_type):
|
||||
yield ArrayInitializer(*conditions)
|
||||
|
||||
|
||||
def build_action(config, arg_type):
|
||||
from esphomeyaml.components import light, mqtt, switch
|
||||
DELAY_ACTION_SCHEMA = cv.templatable(cv.positive_time_period_milliseconds)
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_DELAY, DELAY_ACTION_SCHEMA)
|
||||
def delay_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
# Keep pylint from freaking out
|
||||
var = None
|
||||
if CONF_DELAY in config:
|
||||
rhs = App.register_component(DelayAction.new(template_arg))
|
||||
type = DelayAction.template(template_arg)
|
||||
action = Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
template_ = None
|
||||
for template_ in templatable(config[CONF_DELAY], arg_type, uint32):
|
||||
yield
|
||||
add(action.set_delay(template_))
|
||||
yield action
|
||||
elif CONF_LAMBDA in config:
|
||||
lambda_ = None
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(arg_type, 'x')]):
|
||||
yield None
|
||||
rhs = LambdaAction.new(template_arg, lambda_)
|
||||
type = LambdaAction.template(template_arg)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_MQTT_PUBLISH in config:
|
||||
conf = config[CONF_MQTT_PUBLISH]
|
||||
rhs = App.Pget_mqtt_client().Pmake_publish_action()
|
||||
type = mqtt.MQTTPublishAction.template(template_arg)
|
||||
action = Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_TOPIC], arg_type, std_string):
|
||||
yield None
|
||||
add(action.set_topic(template_))
|
||||
rhs = App.register_component(DelayAction.new(template_arg))
|
||||
type = DelayAction.template(template_arg)
|
||||
action = Pvariable(action_id, rhs, type=type)
|
||||
for template_ in templatable(config, arg_type, uint32):
|
||||
yield
|
||||
add(action.set_delay(template_))
|
||||
yield action
|
||||
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_PAYLOAD], arg_type, std_string):
|
||||
|
||||
IF_ACTION_SCHEMA = vol.All({
|
||||
vol.Required(CONF_CONDITION): validate_recursive_condition,
|
||||
vol.Optional(CONF_THEN): validate_recursive_action,
|
||||
vol.Optional(CONF_ELSE): validate_recursive_action,
|
||||
}, cv.has_at_least_one_key(CONF_THEN, CONF_ELSE))
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_IF, IF_ACTION_SCHEMA)
|
||||
def if_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for conditions in build_conditions(config[CONF_CONDITION], arg_type):
|
||||
yield None
|
||||
rhs = IfAction.new(template_arg, conditions)
|
||||
type = IfAction.template(template_arg)
|
||||
action = Pvariable(action_id, rhs, type=type)
|
||||
if CONF_THEN in config:
|
||||
for actions in build_actions(config[CONF_THEN], arg_type):
|
||||
yield None
|
||||
add(action.set_payload(template_))
|
||||
if CONF_QOS in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_QOS], arg_type, uint8):
|
||||
yield
|
||||
add(action.set_qos(template_))
|
||||
if CONF_RETAIN in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_RETAIN], arg_type, bool_):
|
||||
yield None
|
||||
add(action.set_retain(template_))
|
||||
yield action
|
||||
elif CONF_LIGHT_TOGGLE in config:
|
||||
conf = config[CONF_LIGHT_TOGGLE]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
add(action.add_then(actions))
|
||||
if CONF_ELSE in config:
|
||||
for actions in build_actions(config[CONF_ELSE], arg_type):
|
||||
yield None
|
||||
rhs = var.make_toggle_action(template_arg)
|
||||
type = light.ToggleAction.template(template_arg)
|
||||
action = Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
if CONF_TRANSITION_LENGTH in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_TRANSITION_LENGTH], arg_type, uint32):
|
||||
yield None
|
||||
add(action.set_transition_length(template_))
|
||||
yield action
|
||||
elif CONF_LIGHT_TURN_OFF in config:
|
||||
conf = config[CONF_LIGHT_TURN_OFF]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_off_action(template_arg)
|
||||
type = light.TurnOffAction.template(template_arg)
|
||||
action = Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
if CONF_TRANSITION_LENGTH in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_TRANSITION_LENGTH], arg_type, uint32):
|
||||
yield None
|
||||
add(action.set_transition_length(template_))
|
||||
yield action
|
||||
elif CONF_LIGHT_TURN_ON in config:
|
||||
conf = config[CONF_LIGHT_TURN_ON]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_on_action(template_arg)
|
||||
type = light.TurnOnAction.template(template_arg)
|
||||
action = Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
if CONF_TRANSITION_LENGTH in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_TRANSITION_LENGTH], arg_type, uint32):
|
||||
yield None
|
||||
add(action.set_transition_length(template_))
|
||||
if CONF_FLASH_LENGTH in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_FLASH_LENGTH], arg_type, uint32):
|
||||
yield None
|
||||
add(action.set_flash_length(template_))
|
||||
if CONF_BRIGHTNESS in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_BRIGHTNESS], arg_type, float_):
|
||||
yield None
|
||||
add(action.set_brightness(template_))
|
||||
if CONF_RED in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_RED], arg_type, float_):
|
||||
yield None
|
||||
add(action.set_red(template_))
|
||||
if CONF_GREEN in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_GREEN], arg_type, float_):
|
||||
yield None
|
||||
add(action.set_green(template_))
|
||||
if CONF_BLUE in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_BLUE], arg_type, float_):
|
||||
yield None
|
||||
add(action.set_blue(template_))
|
||||
if CONF_WHITE in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_WHITE], arg_type, float_):
|
||||
yield None
|
||||
add(action.set_white(template_))
|
||||
if CONF_EFFECT in conf:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_EFFECT], arg_type, std_string):
|
||||
yield None
|
||||
add(action.set_effect(template_))
|
||||
yield action
|
||||
elif CONF_SWITCH_TOGGLE in config:
|
||||
conf = config[CONF_SWITCH_TOGGLE]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_toggle_action(template_arg)
|
||||
type = switch.ToggleAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_SWITCH_TURN_OFF in config:
|
||||
conf = config[CONF_SWITCH_TURN_OFF]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_off_action(template_arg)
|
||||
type = switch.TurnOffAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_SWITCH_TURN_ON in config:
|
||||
conf = config[CONF_SWITCH_TURN_ON]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_on_action(template_arg)
|
||||
type = switch.TurnOnAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_COVER_OPEN in config:
|
||||
conf = config[CONF_COVER_OPEN]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_open_action(template_arg)
|
||||
type = cover.OpenAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_COVER_CLOSE in config:
|
||||
conf = config[CONF_COVER_CLOSE]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_close_action(template_arg)
|
||||
type = cover.CloseAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_COVER_STOP in config:
|
||||
conf = config[CONF_COVER_STOP]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_stop_action(template_arg)
|
||||
type = cover.StopAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_FAN_TOGGLE in config:
|
||||
conf = config[CONF_FAN_TOGGLE]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_toggle_action(template_arg)
|
||||
type = fan.ToggleAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_FAN_TURN_OFF in config:
|
||||
conf = config[CONF_FAN_TURN_OFF]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_off_action(template_arg)
|
||||
type = fan.TurnOffAction.template(arg_type)
|
||||
yield Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
elif CONF_FAN_TURN_ON in config:
|
||||
conf = config[CONF_FAN_TURN_ON]
|
||||
for var in get_variable(conf[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_on_action(template_arg)
|
||||
type = fan.TurnOnAction.template(arg_type)
|
||||
action = Pvariable(config[CONF_ACTION_ID], rhs, type=type)
|
||||
if CONF_OSCILLATING in config:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_OSCILLATING], arg_type, bool_):
|
||||
yield None
|
||||
add(action.set_oscillating(template_))
|
||||
if CONF_SPEED in config:
|
||||
template_ = None
|
||||
for template_ in templatable(conf[CONF_SPEED], arg_type, fan.FanSpeed):
|
||||
yield None
|
||||
add(action.set_speed(template_))
|
||||
yield action
|
||||
else:
|
||||
raise ESPHomeYAMLError(u"Unsupported action {}".format(config))
|
||||
add(action.add_else(actions))
|
||||
yield action
|
||||
|
||||
|
||||
LAMBDA_ACTION_SCHEMA = cv.lambda_
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_LAMBDA, LAMBDA_ACTION_SCHEMA)
|
||||
def lambda_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for lambda_ in process_lambda(config, [(arg_type, 'x')]):
|
||||
yield None
|
||||
rhs = LambdaAction.new(template_arg, lambda_)
|
||||
type = LambdaAction.template(template_arg)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_COMPONENT_UPDATE = 'component.update'
|
||||
COMPONENT_UPDATE_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_COMPONENT_UPDATE, COMPONENT_UPDATE_ACTION_SCHEMA)
|
||||
def component_update_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = UpdateComponentAction.new(var)
|
||||
type = UpdateComponentAction.template(template_arg)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
def build_action(full_config, arg_type):
|
||||
action_id = full_config[CONF_ACTION_ID]
|
||||
key, config = next((k, v) for k, v in full_config.items() if k in ACTION_REGISTRY)
|
||||
|
||||
builder = ACTION_REGISTRY[key][1]
|
||||
for result in builder(config, action_id, arg_type):
|
||||
yield None
|
||||
yield result
|
||||
|
||||
|
||||
def build_actions(config, arg_type):
|
||||
@@ -387,22 +256,23 @@ def build_actions(config, arg_type):
|
||||
for action in build_action(conf, arg_type):
|
||||
yield None
|
||||
actions.append(action)
|
||||
yield ArrayInitializer(*actions)
|
||||
yield ArrayInitializer(*actions, multiline=False)
|
||||
|
||||
|
||||
def build_automation_(trigger, arg_type, config):
|
||||
rhs = App.make_automation(trigger)
|
||||
rhs = App.make_automation(TemplateArguments(arg_type), trigger)
|
||||
type = Automation.template(arg_type)
|
||||
obj = Pvariable(config[CONF_AUTOMATION_ID], rhs, type=type)
|
||||
if CONF_IF in config:
|
||||
conditions = None
|
||||
for conditions in build_conditions(config[CONF_IF], arg_type):
|
||||
yield
|
||||
yield None
|
||||
add(obj.add_conditions(conditions))
|
||||
actions = None
|
||||
for actions in build_actions(config[CONF_THEN], arg_type):
|
||||
yield
|
||||
yield None
|
||||
add(obj.add_actions(actions))
|
||||
yield obj
|
||||
|
||||
|
||||
def build_automation(trigger, arg_type, config):
|
||||
|
||||
@@ -1,10 +0,0 @@
{
  "squash": false,
  "build_from": {
    "aarch64": "arm64v8/ubuntu:bionic",
    "amd64": "ubuntu:bionic",
    "armhf": "homeassistant/armhf-base:latest",
    "i386": "i386/ubuntu:bionic"
  },
  "args": {}
}
@@ -2,11 +2,12 @@ import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import automation
|
||||
from esphomeyaml.const import CONF_DEVICE_CLASS, CONF_ID, CONF_INVERTED, CONF_MAX_LENGTH, \
|
||||
CONF_MIN_LENGTH, CONF_MQTT_ID, CONF_ON_CLICK, CONF_ON_DOUBLE_CLICK, CONF_ON_PRESS, \
|
||||
CONF_ON_RELEASE, CONF_TRIGGER_ID
|
||||
from esphomeyaml.helpers import App, NoArg, Pvariable, add, esphomelib_ns, setup_mqtt_component, \
|
||||
add_job
|
||||
from esphomeyaml.const import CONF_DEVICE_CLASS, CONF_ID, CONF_INTERNAL, CONF_INVERTED, \
|
||||
CONF_MAX_LENGTH, CONF_MIN_LENGTH, CONF_MQTT_ID, CONF_ON_CLICK, CONF_ON_DOUBLE_CLICK, \
|
||||
CONF_ON_PRESS, CONF_ON_RELEASE, CONF_TRIGGER_ID, CONF_FILTERS, CONF_INVERT, CONF_DELAYED_ON, \
|
||||
CONF_DELAYED_OFF, CONF_LAMBDA, CONF_HEARTBEAT
|
||||
from esphomeyaml.helpers import App, NoArg, Pvariable, add, add_job, esphomelib_ns, \
|
||||
setup_mqtt_component, bool_, process_lambda, ArrayInitializer
|
||||
|
||||
DEVICE_CLASSES = [
|
||||
'', 'battery', 'cold', 'connectivity', 'door', 'garage_door', 'gas',
|
||||
@@ -25,38 +26,94 @@ ReleaseTrigger = binary_sensor_ns.ReleaseTrigger
|
||||
ClickTrigger = binary_sensor_ns.ClickTrigger
|
||||
DoubleClickTrigger = binary_sensor_ns.DoubleClickTrigger
|
||||
BinarySensor = binary_sensor_ns.BinarySensor
|
||||
InvertFilter = binary_sensor_ns.InvertFilter
|
||||
LambdaFilter = binary_sensor_ns.LambdaFilter
|
||||
DelayedOnFilter = binary_sensor_ns.DelayedOnFilter
|
||||
DelayedOffFilter = binary_sensor_ns.DelayedOffFilter
|
||||
HeartbeatFilter = binary_sensor_ns.HeartbeatFilter
|
||||
MQTTBinarySensorComponent = binary_sensor_ns.MQTTBinarySensorComponent
|
||||
|
||||
FILTER_KEYS = [CONF_INVERT, CONF_DELAYED_ON, CONF_DELAYED_OFF, CONF_LAMBDA, CONF_HEARTBEAT]
|
||||
|
||||
FILTERS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
|
||||
vol.Optional(CONF_INVERT): None,
|
||||
vol.Optional(CONF_DELAYED_ON): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_DELAYED_OFF): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_HEARTBEAT): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_LAMBDA): cv.lambda_,
|
||||
}, cv.has_exactly_one_key(*FILTER_KEYS))])
|
||||
|
||||
BINARY_SENSOR_SCHEMA = cv.MQTT_COMPONENT_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTBinarySensorComponent),
|
||||
cv.GenerateID(): cv.declare_variable_id(BinarySensor),
|
||||
vol.Optional(CONF_INVERTED): cv.boolean,
|
||||
|
||||
vol.Optional(CONF_DEVICE_CLASS): vol.All(vol.Lower, cv.one_of(*DEVICE_CLASSES)),
|
||||
vol.Optional(CONF_ON_PRESS): vol.All(cv.ensure_list, [automation.AUTOMATION_SCHEMA.extend({
|
||||
vol.Optional(CONF_FILTERS): FILTERS_SCHEMA,
|
||||
vol.Optional(CONF_ON_PRESS): automation.validate_automation({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(PressTrigger),
|
||||
})]),
|
||||
vol.Optional(CONF_ON_RELEASE): vol.All(cv.ensure_list, [automation.AUTOMATION_SCHEMA.extend({
|
||||
}),
|
||||
vol.Optional(CONF_ON_RELEASE): automation.validate_automation({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(ReleaseTrigger),
|
||||
})]),
|
||||
vol.Optional(CONF_ON_CLICK): vol.All(cv.ensure_list, [automation.AUTOMATION_SCHEMA.extend({
|
||||
}),
|
||||
vol.Optional(CONF_ON_CLICK): automation.validate_automation({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(ClickTrigger),
|
||||
vol.Optional(CONF_MIN_LENGTH, default='50ms'): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_MAX_LENGTH, default='350ms'): cv.positive_time_period_milliseconds,
|
||||
})]),
|
||||
vol.Optional(CONF_ON_DOUBLE_CLICK):
|
||||
vol.All(cv.ensure_list, [automation.AUTOMATION_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(DoubleClickTrigger),
|
||||
vol.Optional(CONF_MIN_LENGTH, default='50ms'): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_MAX_LENGTH, default='350ms'): cv.positive_time_period_milliseconds,
|
||||
})]),
|
||||
}),
|
||||
vol.Optional(CONF_ON_DOUBLE_CLICK): automation.validate_automation({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(DoubleClickTrigger),
|
||||
vol.Optional(CONF_MIN_LENGTH, default='50ms'): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_MAX_LENGTH, default='350ms'): cv.positive_time_period_milliseconds,
|
||||
}),
|
||||
|
||||
vol.Optional(CONF_INVERTED): cv.invalid(
|
||||
"The inverted binary_sensor property has been replaced by the "
|
||||
"new 'invert' binary sensor filter. Please see "
|
||||
"https://esphomelib.com/esphomeyaml/components/binary_sensor/index.html."
|
||||
),
|
||||
})
|
||||
|
||||
BINARY_SENSOR_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(BINARY_SENSOR_SCHEMA.schema)
|
||||
|
||||
|
||||
def setup_filter(config):
|
||||
if CONF_INVERT in config:
|
||||
yield InvertFilter.new()
|
||||
elif CONF_DELAYED_OFF in config:
|
||||
yield App.register_component(DelayedOffFilter.new(config[CONF_DELAYED_OFF]))
|
||||
elif CONF_DELAYED_ON in config:
|
||||
yield App.register_component(DelayedOnFilter.new(config[CONF_DELAYED_ON]))
|
||||
elif CONF_HEARTBEAT in config:
|
||||
yield App.register_component(HeartbeatFilter.new(config[CONF_HEARTBEAT]))
|
||||
elif CONF_LAMBDA in config:
|
||||
lambda_ = None
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(bool_, 'x')]):
|
||||
yield None
|
||||
yield LambdaFilter.new(lambda_)
|
||||
|
||||
|
||||
def setup_filters(config):
|
||||
filters = []
|
||||
for conf in config:
|
||||
filter = None
|
||||
for filter in setup_filter(conf):
|
||||
yield None
|
||||
filters.append(filter)
|
||||
yield ArrayInitializer(*filters)
|
||||
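For reference, a filter list (as already-parsed YAML data) that the FILTERS_SCHEMA above should accept; each entry carries exactly one of the filter keys, and the key names below assume the usual const values ('invert', 'delayed_on', 'lambda'):

example_filters = FILTERS_SCHEMA([
    {'invert': None},
    {'delayed_on': '100ms'},
    {'lambda': 'return !x;'},
])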
|
||||
|
||||
def setup_binary_sensor_core_(binary_sensor_var, mqtt_var, config):
|
||||
if CONF_INTERNAL in config:
|
||||
add(binary_sensor_var.set_internal(CONF_INTERNAL))
|
||||
if CONF_DEVICE_CLASS in config:
|
||||
add(binary_sensor_var.set_device_class(config[CONF_DEVICE_CLASS]))
|
||||
if CONF_INVERTED in config:
|
||||
add(binary_sensor_var.set_inverted(config[CONF_INVERTED]))
|
||||
if CONF_FILTERS in config:
|
||||
filters = None
|
||||
for filters in setup_filters(config[CONF_FILTERS]):
|
||||
yield
|
||||
add(binary_sensor_var.add_filters(filters))
|
||||
|
||||
for conf in config.get(CONF_ON_PRESS, []):
|
||||
rhs = binary_sensor_var.make_press_trigger()
|
||||
|
||||
@@ -1,48 +0,0 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import binary_sensor
|
||||
from esphomeyaml.components.esp32_ble import ESP32BLETracker
|
||||
from esphomeyaml.const import CONF_MAC_ADDRESS, CONF_NAME, ESP_PLATFORM_ESP32
|
||||
from esphomeyaml.core import HexInt, MACAddress
|
||||
from esphomeyaml.helpers import ArrayInitializer, get_variable
|
||||
|
||||
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||
DEPENDENCIES = ['esp32_ble']
|
||||
|
||||
CONF_ESP32_BLE_ID = 'esp32_ble_id'
|
||||
|
||||
|
||||
def validate_mac(value):
|
||||
value = cv.string_strict(value)
|
||||
parts = value.split(':')
|
||||
if len(parts) != 6:
|
||||
raise vol.Invalid("MAC Address must consist of 6 : (colon) separated parts")
|
||||
parts_int = []
|
||||
if any(len(part) != 2 for part in parts):
|
||||
raise vol.Invalid("MAC Address must be format XX:XX:XX:XX:XX:XX")
|
||||
for part in parts:
|
||||
try:
|
||||
parts_int.append(int(part, 16))
|
||||
except ValueError:
|
||||
raise vol.Invalid("MAC Address parts must be hexadecimal values from 00 to FF")
|
||||
|
||||
return MACAddress(*parts_int)
|
||||
|
||||
|
||||
PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
|
||||
vol.Required(CONF_MAC_ADDRESS): validate_mac,
|
||||
cv.GenerateID(CONF_ESP32_BLE_ID): cv.use_variable_id(ESP32BLETracker)
|
||||
}).extend(binary_sensor.BINARY_SENSOR_SCHEMA.schema)
|
||||
|
||||
|
||||
def to_code(config):
|
||||
hub = None
|
||||
for hub in get_variable(CONF_ESP32_BLE_ID):
|
||||
yield
|
||||
addr = [HexInt(i) for i in config[CONF_MAC_ADDRESS].parts]
|
||||
rhs = hub.make_device(config[CONF_NAME], ArrayInitializer(*addr, multiline=False))
|
||||
binary_sensor.register_binary_sensor(rhs, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_ESP32_BLE_TRACKER'
|
||||
esphomeyaml/components/binary_sensor/esp32_ble_tracker.py (new file, 23 lines)
@@ -0,0 +1,23 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.components.esp32_ble_tracker import CONF_ESP32_BLE_ID, ESP32BLETracker, \
    make_address_array
from esphomeyaml.const import CONF_MAC_ADDRESS, CONF_NAME
from esphomeyaml.helpers import get_variable

DEPENDENCIES = ['esp32_ble_tracker']

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    vol.Required(CONF_MAC_ADDRESS): cv.mac_address,
    cv.GenerateID(CONF_ESP32_BLE_ID): cv.use_variable_id(ESP32BLETracker)
}))


def to_code(config):
    hub = None
    for hub in get_variable(config[CONF_ESP32_BLE_ID]):
        yield
    rhs = hub.make_presence_sensor(config[CONF_NAME], make_address_array(config[CONF_MAC_ADDRESS]))
    binary_sensor.register_binary_sensor(rhs, config)
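The "for hub in get_variable(...): yield" idiom used by this and the following platforms makes to_code a generator that suspends until another job has declared the referenced variable; add_job in esphomeyaml.helpers drives these generators. A self-contained toy model of the idea (names invented, not the real scheduler):

# Toy illustration of the suspend-until-declared pattern.
def toy_get_variable(declared, key):
    # Yield None while the variable is still missing, then yield it once.
    while key not in declared:
        yield None
    yield declared[key]


def toy_to_code(declared, config):
    hub = None
    for hub in toy_get_variable(declared, config['hub_id']):
        yield  # propagate the suspension to whatever is driving this generator
    # At this point 'hub' holds the resolved variable and code generation proceeds.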
@@ -34,11 +34,11 @@ def validate_touch_pad(value):
     return value


-PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
+PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
     vol.Required(CONF_PIN): validate_touch_pad,
     vol.Required(CONF_THRESHOLD): cv.uint16_t,
     cv.GenerateID(CONF_ESP32_TOUCH_ID): cv.use_variable_id(ESP32TouchComponent),
-}).extend(binary_sensor.BINARY_SENSOR_SCHEMA.schema)
+}))


 def to_code(config):
@@ -8,10 +8,10 @@ from esphomeyaml.helpers import App, gpio_input_pin_expression, variable, Applic

 MakeGPIOBinarySensor = Application.MakeGPIOBinarySensor

-PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
+PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeGPIOBinarySensor),
-    vol.Required(CONF_PIN): pins.GPIO_INPUT_PIN_SCHEMA
-}).extend(binary_sensor.BINARY_SENSOR_SCHEMA.schema)
+    vol.Required(CONF_PIN): pins.gpio_input_pin_schema
+}))


 def to_code(config):
esphomeyaml/components/binary_sensor/nextion.py (new file, 26 lines)
@@ -0,0 +1,26 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.components.display.nextion import Nextion
from esphomeyaml.const import CONF_COMPONENT_ID, CONF_NAME, CONF_PAGE_ID
from esphomeyaml.helpers import get_variable

DEPENDENCIES = ['display']

CONF_NEXTION_ID = 'nextion_id'

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    vol.Required(CONF_PAGE_ID): cv.uint8_t,
    vol.Required(CONF_COMPONENT_ID): cv.uint8_t,
    cv.GenerateID(CONF_NEXTION_ID): cv.use_variable_id(Nextion)
}))


def to_code(config):
    hub = None
    for hub in get_variable(config[CONF_NEXTION_ID]):
        yield
    rhs = hub.make_touch_component(config[CONF_NAME], config[CONF_PAGE_ID],
                                   config[CONF_COMPONENT_ID])
    binary_sensor.register_binary_sensor(rhs, config)
esphomeyaml/components/binary_sensor/pn532.py (new file, 42 lines)
@@ -0,0 +1,42 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor
from esphomeyaml.components.pn532 import PN532Component
from esphomeyaml.const import CONF_NAME, CONF_UID
from esphomeyaml.core import HexInt
from esphomeyaml.helpers import ArrayInitializer, get_variable

DEPENDENCIES = ['pn532']

CONF_PN532_ID = 'pn532_id'


def validate_uid(value):
    value = cv.string_strict(value)
    for x in value.split('-'):
        if len(x) != 2:
            raise vol.Invalid("Each part (separated by '-') of the UID must be two characters "
                              "long.")
        try:
            x = int(x, 16)
        except ValueError:
            raise vol.Invalid("Valid characters for parts of a UID are 0123456789ABCDEF.")
        if x < 0 or x > 255:
            raise vol.Invalid("Valid values for UID parts (separated by '-') are 00 to FF")
    return value


PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    vol.Required(CONF_UID): validate_uid,
    cv.GenerateID(CONF_PN532_ID): cv.use_variable_id(PN532Component)
}))


def to_code(config):
    hub = None
    for hub in get_variable(config[CONF_PN532_ID]):
        yield
    addr = [HexInt(int(x, 16)) for x in config[CONF_UID].split('-')]
    rhs = hub.make_tag(config[CONF_NAME], ArrayInitializer(*addr, multiline=False))
    binary_sensor.register_binary_sensor(rhs, config)
esphomeyaml/components/binary_sensor/rdm6300.py (new file, 23 lines)
@@ -0,0 +1,23 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml.components import binary_sensor, rdm6300
from esphomeyaml.const import CONF_NAME, CONF_UID
from esphomeyaml.helpers import get_variable

DEPENDENCIES = ['rdm6300']

CONF_RDM6300_ID = 'rdm6300_id'

PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
    vol.Required(CONF_UID): cv.uint32_t,
    cv.GenerateID(CONF_RDM6300_ID): cv.use_variable_id(rdm6300.RDM6300Component)
}))


def to_code(config):
    hub = None
    for hub in get_variable(config[CONF_RDM6300_ID]):
        yield
    rhs = hub.make_card(config[CONF_NAME], config[CONF_UID])
    binary_sensor.register_binary_sensor(rhs, config)
121
esphomeyaml/components/binary_sensor/remote_receiver.py
Normal file
121
esphomeyaml/components/binary_sensor/remote_receiver.py
Normal file
@@ -0,0 +1,121 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import binary_sensor
|
||||
from esphomeyaml.components.remote_receiver import RemoteReceiverComponent, remote_ns
|
||||
from esphomeyaml.components.remote_transmitter import RC_SWITCH_RAW_SCHEMA, \
|
||||
RC_SWITCH_TYPE_A_SCHEMA, RC_SWITCH_TYPE_B_SCHEMA, RC_SWITCH_TYPE_C_SCHEMA, \
|
||||
RC_SWITCH_TYPE_D_SCHEMA, binary_code, build_rc_switch_protocol
|
||||
from esphomeyaml.const import CONF_ADDRESS, CONF_CHANNEL, CONF_CODE, CONF_COMMAND, CONF_DATA, \
|
||||
CONF_DEVICE, CONF_FAMILY, CONF_GROUP, CONF_LG, CONF_NAME, CONF_NBITS, CONF_NEC, \
|
||||
CONF_PANASONIC, CONF_PROTOCOL, CONF_RAW, CONF_RC_SWITCH_RAW, CONF_RC_SWITCH_TYPE_A, \
|
||||
CONF_RC_SWITCH_TYPE_B, CONF_RC_SWITCH_TYPE_C, CONF_RC_SWITCH_TYPE_D, CONF_SAMSUNG, CONF_SONY, \
|
||||
CONF_STATE
|
||||
from esphomeyaml.helpers import ArrayInitializer, Pvariable, get_variable
|
||||
|
||||
DEPENDENCIES = ['remote_receiver']
|
||||
|
||||
REMOTE_KEYS = [CONF_NEC, CONF_LG, CONF_SONY, CONF_PANASONIC, CONF_SAMSUNG, CONF_RAW,
|
||||
CONF_RC_SWITCH_RAW, CONF_RC_SWITCH_TYPE_A, CONF_RC_SWITCH_TYPE_B,
|
||||
CONF_RC_SWITCH_TYPE_C, CONF_RC_SWITCH_TYPE_D]
|
||||
|
||||
CONF_REMOTE_RECEIVER_ID = 'remote_receiver_id'
|
||||
CONF_RECEIVER_ID = 'receiver_id'
|
||||
|
||||
RemoteReceiver = remote_ns.RemoteReceiver
|
||||
LGReceiver = remote_ns.LGReceiver
|
||||
NECReceiver = remote_ns.NECReceiver
|
||||
PanasonicReceiver = remote_ns.PanasonicReceiver
|
||||
RawReceiver = remote_ns.RawReceiver
|
||||
SamsungReceiver = remote_ns.SamsungReceiver
|
||||
SonyReceiver = remote_ns.SonyReceiver
|
||||
RCSwitchRawReceiver = remote_ns.RCSwitchRawReceiver
|
||||
RCSwitchTypeAReceiver = remote_ns.RCSwitchTypeAReceiver
|
||||
RCSwitchTypeBReceiver = remote_ns.RCSwitchTypeBReceiver
|
||||
RCSwitchTypeCReceiver = remote_ns.RCSwitchTypeCReceiver
|
||||
RCSwitchTypeDReceiver = remote_ns.RCSwitchTypeDReceiver
|
||||
|
||||
PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
|
||||
vol.Optional(CONF_LG): vol.Schema({
|
||||
vol.Required(CONF_DATA): cv.hex_uint32_t,
|
||||
vol.Optional(CONF_NBITS, default=28): vol.All(vol.Coerce(int), cv.one_of(28, 32)),
|
||||
}),
|
||||
vol.Optional(CONF_NEC): vol.Schema({
|
||||
vol.Required(CONF_ADDRESS): cv.hex_uint16_t,
|
||||
vol.Required(CONF_COMMAND): cv.hex_uint16_t,
|
||||
}),
|
||||
vol.Optional(CONF_SAMSUNG): vol.Schema({
|
||||
vol.Required(CONF_DATA): cv.hex_uint32_t,
|
||||
}),
|
||||
vol.Optional(CONF_SONY): vol.Schema({
|
||||
vol.Required(CONF_DATA): cv.hex_uint32_t,
|
||||
vol.Optional(CONF_NBITS, default=12): vol.All(vol.Coerce(int), cv.one_of(12, 15, 20)),
|
||||
}),
|
||||
vol.Optional(CONF_PANASONIC): vol.Schema({
|
||||
vol.Required(CONF_ADDRESS): cv.hex_uint16_t,
|
||||
vol.Required(CONF_COMMAND): cv.hex_uint32_t,
|
||||
}),
|
||||
vol.Optional(CONF_RAW): [vol.Any(vol.Coerce(int), cv.time_period_microseconds)],
|
||||
vol.Optional(CONF_RC_SWITCH_RAW): RC_SWITCH_RAW_SCHEMA,
|
||||
vol.Optional(CONF_RC_SWITCH_TYPE_A): RC_SWITCH_TYPE_A_SCHEMA,
|
||||
vol.Optional(CONF_RC_SWITCH_TYPE_B): RC_SWITCH_TYPE_B_SCHEMA,
|
||||
vol.Optional(CONF_RC_SWITCH_TYPE_C): RC_SWITCH_TYPE_C_SCHEMA,
|
||||
vol.Optional(CONF_RC_SWITCH_TYPE_D): RC_SWITCH_TYPE_D_SCHEMA,
|
||||
|
||||
cv.GenerateID(CONF_REMOTE_RECEIVER_ID): cv.use_variable_id(RemoteReceiverComponent),
|
||||
cv.GenerateID(CONF_RECEIVER_ID): cv.declare_variable_id(RemoteReceiver),
|
||||
}), cv.has_exactly_one_key(*REMOTE_KEYS))
|
||||
|
||||
|
||||
def receiver_base(full_config):
|
||||
name = full_config[CONF_NAME]
|
||||
key, config = next((k, v) for k, v in full_config.items() if k in REMOTE_KEYS)
|
||||
if key == CONF_LG:
|
||||
return LGReceiver.new(name, config[CONF_DATA], config[CONF_NBITS])
|
||||
elif key == CONF_NEC:
|
||||
return NECReceiver.new(name, config[CONF_ADDRESS], config[CONF_COMMAND])
|
||||
elif key == CONF_PANASONIC:
|
||||
return PanasonicReceiver.new(name, config[CONF_ADDRESS], config[CONF_COMMAND])
|
||||
elif key == CONF_SAMSUNG:
|
||||
return SamsungReceiver.new(name, config[CONF_DATA])
|
||||
elif key == CONF_SONY:
|
||||
return SonyReceiver.new(name, config[CONF_DATA], config[CONF_NBITS])
|
||||
elif key == CONF_RAW:
|
||||
data = ArrayInitializer(*config, multiline=False)
|
||||
return RawReceiver.new(name, data)
|
||||
elif key == CONF_RC_SWITCH_RAW:
|
||||
return RCSwitchRawReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
|
||||
binary_code(config[CONF_CODE]), len(config[CONF_CODE]))
|
||||
elif key == CONF_RC_SWITCH_TYPE_A:
|
||||
return RCSwitchTypeAReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
|
||||
binary_code(config[CONF_GROUP]),
|
||||
binary_code(config[CONF_DEVICE]),
|
||||
config[CONF_STATE])
|
||||
elif key == CONF_RC_SWITCH_TYPE_B:
|
||||
return RCSwitchTypeBReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
|
||||
config[CONF_ADDRESS], config[CONF_CHANNEL],
|
||||
config[CONF_STATE])
|
||||
elif key == CONF_RC_SWITCH_TYPE_C:
|
||||
return RCSwitchTypeCReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
|
||||
ord(config[CONF_FAMILY][0]) - ord('a'),
|
||||
config[CONF_GROUP], config[CONF_DEVICE],
|
||||
config[CONF_STATE])
|
||||
elif key == CONF_RC_SWITCH_TYPE_D:
|
||||
return RCSwitchTypeDReceiver.new(name, build_rc_switch_protocol(config[CONF_PROTOCOL]),
|
||||
ord(config[CONF_GROUP][0]) - ord('a'),
|
||||
config[CONF_DEVICE], config[CONF_STATE])
|
||||
else:
|
||||
raise NotImplementedError("Unknown receiver type {}".format(config))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
remote = None
|
||||
for remote in get_variable(config[CONF_REMOTE_RECEIVER_ID]):
|
||||
yield
|
||||
rhs = receiver_base(config)
|
||||
receiver = Pvariable(config[CONF_RECEIVER_ID], rhs)
|
||||
|
||||
binary_sensor.register_binary_sensor(remote.add_decoder(receiver), config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_REMOTE_RECEIVER'
|
||||
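One detail from receiver_base above: for RC-switch type C and D codes the single-letter family/group is mapped to a numeric index with ord(letter) - ord('a'), so for example:

# Illustrative only.
family_index = ord('c') - ord('a')  # -> 2, i.e. family "c" becomes index 2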
@@ -7,9 +7,9 @@ DEPENDENCIES = ['mqtt']

 MakeStatusBinarySensor = Application.MakeStatusBinarySensor

-PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
+PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeStatusBinarySensor),
-}).extend(binary_sensor.BINARY_SENSOR_SCHEMA.schema)
+}))


 def to_code(config):
@@ -3,23 +3,26 @@ import voluptuous as vol
 import esphomeyaml.config_validation as cv
 from esphomeyaml.components import binary_sensor
 from esphomeyaml.const import CONF_LAMBDA, CONF_MAKE_ID, CONF_NAME
-from esphomeyaml.helpers import App, Application, process_lambda, variable
+from esphomeyaml.helpers import App, Application, process_lambda, variable, optional, bool_, add

 MakeTemplateBinarySensor = Application.MakeTemplateBinarySensor

-PLATFORM_SCHEMA = binary_sensor.PLATFORM_SCHEMA.extend({
+PLATFORM_SCHEMA = cv.nameable(binary_sensor.BINARY_SENSOR_PLATFORM_SCHEMA.extend({
     cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeTemplateBinarySensor),
     vol.Required(CONF_LAMBDA): cv.lambda_,
-}).extend(binary_sensor.BINARY_SENSOR_SCHEMA.schema)
+}))


 def to_code(config):
-    template_ = None
-    for template_ in process_lambda(config[CONF_LAMBDA], []):
-        yield
-    rhs = App.make_template_binary_sensor(config[CONF_NAME], template_)
+    rhs = App.make_template_binary_sensor(config[CONF_NAME])
     make = variable(config[CONF_MAKE_ID], rhs)
     binary_sensor.setup_binary_sensor(make.Ptemplate_, make.Pmqtt, config)
+
+    template_ = None
+    for template_ in process_lambda(config[CONF_LAMBDA], [],
+                                    return_type=optional.template(bool_)):
+        yield
+    add(make.Ptemplate_.set_template(template_))


 BUILD_FLAGS = '-DUSE_TEMPLATE_BINARY_SENSOR'
@@ -1,6 +1,10 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_ID, CONF_MQTT_ID
|
||||
from esphomeyaml.helpers import Pvariable, esphomelib_ns, setup_mqtt_component
|
||||
from esphomeyaml.const import CONF_ID, CONF_MQTT_ID, CONF_INTERNAL
|
||||
from esphomeyaml.helpers import Pvariable, esphomelib_ns, setup_mqtt_component, add, \
|
||||
TemplateArguments, get_variable
|
||||
|
||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||
|
||||
@@ -21,8 +25,12 @@ COVER_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTCoverComponent),
|
||||
})
|
||||
|
||||
COVER_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(COVER_SCHEMA.schema)
|
||||
|
||||
|
||||
def setup_cover_core_(cover_var, mqtt_var, config):
|
||||
if CONF_INTERNAL in config:
|
||||
add(cover_var.set_internal(config[CONF_INTERNAL]))
|
||||
setup_mqtt_component(mqtt_var, config)
|
||||
|
||||
|
||||
@@ -33,3 +41,50 @@ def setup_cover(cover_obj, mqtt_obj, config):
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_COVER'
|
||||
|
||||
CONF_COVER_OPEN = 'cover.open'
|
||||
COVER_OPEN_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_COVER_OPEN, COVER_OPEN_ACTION_SCHEMA)
|
||||
def cover_open_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_open_action(template_arg)
|
||||
type = OpenAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_COVER_CLOSE = 'cover.close'
|
||||
COVER_CLOSE_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_COVER_CLOSE, COVER_CLOSE_ACTION_SCHEMA)
|
||||
def cover_close_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_close_action(template_arg)
|
||||
type = CloseAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_COVER_STOP = 'cover.stop'
|
||||
COVER_STOP_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_COVER_STOP, COVER_STOP_ACTION_SCHEMA)
|
||||
def cover_stop_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_stop_action(template_arg)
|
||||
type = StopAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
@@ -5,48 +5,43 @@ from esphomeyaml import automation
|
||||
from esphomeyaml.components import cover
|
||||
from esphomeyaml.const import CONF_CLOSE_ACTION, CONF_LAMBDA, CONF_MAKE_ID, CONF_NAME, \
|
||||
CONF_OPEN_ACTION, CONF_STOP_ACTION, CONF_OPTIMISTIC
|
||||
from esphomeyaml.helpers import App, Application, NoArg, add, process_lambda, variable
|
||||
from esphomeyaml.helpers import App, Application, NoArg, add, process_lambda, variable, optional
|
||||
|
||||
MakeTemplateCover = Application.MakeTemplateCover
|
||||
|
||||
PLATFORM_SCHEMA = vol.All(cover.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(cover.COVER_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeTemplateCover),
|
||||
vol.Optional(CONF_LAMBDA): cv.lambda_,
|
||||
vol.Optional(CONF_OPTIMISTIC): cv.boolean,
|
||||
vol.Optional(CONF_OPEN_ACTION): automation.ACTIONS_SCHEMA,
|
||||
vol.Optional(CONF_CLOSE_ACTION): automation.ACTIONS_SCHEMA,
|
||||
vol.Optional(CONF_STOP_ACTION): automation.ACTIONS_SCHEMA,
|
||||
}).extend(cover.COVER_SCHEMA.schema), cv.has_at_least_one_key(CONF_LAMBDA, CONF_OPTIMISTIC))
|
||||
vol.Optional(CONF_OPEN_ACTION): automation.validate_automation(single=True),
|
||||
vol.Optional(CONF_CLOSE_ACTION): automation.validate_automation(single=True),
|
||||
vol.Optional(CONF_STOP_ACTION): automation.validate_automation(single=True),
|
||||
}), cv.has_at_least_one_key(CONF_LAMBDA, CONF_OPTIMISTIC))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
rhs = App.make_template_cover(config[CONF_NAME])
|
||||
make = variable(config[CONF_MAKE_ID], rhs)
|
||||
|
||||
cover.setup_cover(make.Ptemplate_, make.Pmqtt, config)
|
||||
|
||||
if CONF_LAMBDA in config:
|
||||
template_ = None
|
||||
for template_ in process_lambda(config[CONF_LAMBDA], []):
|
||||
for template_ in process_lambda(config[CONF_LAMBDA], [],
|
||||
return_type=optional.template(cover.CoverState)):
|
||||
yield
|
||||
add(make.Ptemplate_.set_state_lambda(template_))
|
||||
if CONF_OPEN_ACTION in config:
|
||||
actions = None
|
||||
for actions in automation.build_actions(config[CONF_OPEN_ACTION], NoArg):
|
||||
yield
|
||||
add(make.Ptemplate_.add_open_actions(actions))
|
||||
automation.build_automation(make.Ptemplate_.get_open_trigger(), NoArg,
|
||||
config[CONF_OPEN_ACTION])
|
||||
if CONF_CLOSE_ACTION in config:
|
||||
actions = None
|
||||
for actions in automation.build_actions(config[CONF_CLOSE_ACTION], NoArg):
|
||||
yield
|
||||
add(make.Ptemplate_.add_close_actions(actions))
|
||||
automation.build_automation(make.Ptemplate_.get_close_trigger(), NoArg,
|
||||
config[CONF_CLOSE_ACTION])
|
||||
if CONF_STOP_ACTION in config:
|
||||
actions = None
|
||||
for actions in automation.build_actions(config[CONF_STOP_ACTION], NoArg):
|
||||
yield
|
||||
add(make.Ptemplate_.add_stop_actions(actions))
|
||||
automation.build_automation(make.Ptemplate_.get_stop_trigger(), NoArg,
|
||||
config[CONF_STOP_ACTION])
|
||||
if CONF_OPTIMISTIC in config:
|
||||
add(make.Ptemplate_.set_optimistic(config[CONF_OPTIMISTIC]))
|
||||
|
||||
cover.setup_cover(make.Ptemplate_, make.Pmqtt, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_TEMPLATE_COVER'
|
||||
|
||||
@@ -11,7 +11,7 @@ DallasComponent = sensor.sensor_ns.DallasComponent
 CONFIG_SCHEMA = vol.All(cv.ensure_list, [vol.Schema({
     cv.GenerateID(): cv.declare_variable_id(DallasComponent),
     vol.Required(CONF_PIN): pins.input_output_pin,
-    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
+    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
 })])


@@ -1,9 +1,11 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml import config_validation as cv, pins
|
||||
from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
|
||||
from esphomeyaml.const import CONF_ID, CONF_NUMBER, CONF_RUN_CYCLES, CONF_RUN_DURATION, \
|
||||
CONF_SLEEP_DURATION, CONF_WAKEUP_PIN
|
||||
from esphomeyaml.helpers import App, Pvariable, add, gpio_input_pin_expression, esphomelib_ns
|
||||
from esphomeyaml.helpers import App, Pvariable, add, gpio_input_pin_expression, esphomelib_ns, \
|
||||
TemplateArguments, get_variable
|
||||
|
||||
|
||||
def validate_pin_number(value):
|
||||
@@ -15,12 +17,24 @@ def validate_pin_number(value):
|
||||
|
||||
|
||||
DeepSleepComponent = esphomelib_ns.DeepSleepComponent
|
||||
EnterDeepSleepAction = esphomelib_ns.EnterDeepSleepAction
|
||||
PreventDeepSleepAction = esphomelib_ns.PreventDeepSleepAction
|
||||
|
||||
WAKEUP_PIN_MODES = {
|
||||
'IGNORE': esphomelib_ns.WAKEUP_PIN_MODE_IGNORE,
|
||||
'KEEP_AWAKE': esphomelib_ns.WAKEUP_PIN_MODE_KEEP_AWAKE,
|
||||
'INVERT_WAKEUP': esphomelib_ns.WAKEUP_PIN_MODE_INVERT_WAKEUP,
|
||||
}
|
||||
|
||||
CONF_WAKEUP_PIN_MODE = 'wakeup_pin_mode'
|
||||
|
||||
CONFIG_SCHEMA = vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(DeepSleepComponent),
|
||||
vol.Optional(CONF_SLEEP_DURATION): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_WAKEUP_PIN): vol.All(cv.only_on_esp32, pins.GPIO_INTERNAL_INPUT_PIN_SCHEMA,
|
||||
vol.Optional(CONF_WAKEUP_PIN): vol.All(cv.only_on_esp32, pins.internal_gpio_input_pin_schema,
|
||||
validate_pin_number),
|
||||
vol.Optional(CONF_WAKEUP_PIN_MODE): vol.All(cv.only_on_esp32, vol.Upper,
|
||||
cv.one_of(*WAKEUP_PIN_MODES)),
|
||||
vol.Optional(CONF_RUN_CYCLES): cv.positive_int,
|
||||
vol.Optional(CONF_RUN_DURATION): cv.positive_time_period_milliseconds,
|
||||
})
|
||||
@@ -36,6 +50,8 @@ def to_code(config):
|
||||
for pin in gpio_input_pin_expression(config[CONF_WAKEUP_PIN]):
|
||||
yield
|
||||
add(deep_sleep.set_wakeup_pin(pin))
|
||||
if CONF_WAKEUP_PIN_MODE in config:
|
||||
add(deep_sleep.set_wakeup_pin_mode(WAKEUP_PIN_MODES[config[CONF_WAKEUP_PIN_MODE]]))
|
||||
if CONF_RUN_CYCLES in config:
|
||||
add(deep_sleep.set_run_cycles(config[CONF_RUN_CYCLES]))
|
||||
if CONF_RUN_DURATION in config:
|
||||
@@ -43,3 +59,35 @@ def to_code(config):
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_DEEP_SLEEP'
|
||||
|
||||
|
||||
CONF_DEEP_SLEEP_ENTER = 'deep_sleep.enter'
|
||||
DEEP_SLEEP_ENTER_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(DeepSleepComponent),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_ENTER, DEEP_SLEEP_ENTER_ACTION_SCHEMA)
|
||||
def deep_sleep_enter_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_enter_deep_sleep_action(template_arg)
|
||||
type = EnterDeepSleepAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_DEEP_SLEEP_PREVENT = 'deep_sleep.prevent'
|
||||
DEEP_SLEEP_PREVENT_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(DeepSleepComponent),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_DEEP_SLEEP_PREVENT, DEEP_SLEEP_PREVENT_ACTION_SCHEMA)
|
||||
def deep_sleep_prevent_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_prevent_deep_sleep_action(template_arg)
|
||||
type = PreventDeepSleepAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
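The two deep_sleep actions above follow the generator style used by all of these to_code functions: the generator yields None while it waits for the referenced variable to become available, and the last value it yields is the finished action expression. A rough standalone sketch of how such a generator can be driven; every name below is an illustrative placeholder, not an esphomeyaml API, and the sketch is not part of this change set:

    def resolve_variable(var_id):
        # Placeholder: pretend the variable can be resolved immediately.
        yield var_id

    def enter_deep_sleep_to_code(var_id):
        for var in resolve_variable(var_id):
            yield None                    # suspended until 'var' exists
        yield "{}.make_enter_deep_sleep_action()".format(var)

    expression = None
    for value in enter_deep_sleep_to_code("deep_sleep_1"):
        expression = value                # the last yielded value is the result
    print(expression)                     # -> deep_sleep_1.make_enter_deep_sleep_action()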
56  esphomeyaml/components/display/__init__.py  Normal file
@@ -0,0 +1,56 @@
|
||||
# coding=utf-8
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_LAMBDA, CONF_ROTATION, CONF_UPDATE_INTERVAL
|
||||
from esphomeyaml.helpers import add, add_job, esphomelib_ns
|
||||
|
||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||
|
||||
})
|
||||
|
||||
display_ns = esphomelib_ns.namespace('display')
|
||||
DisplayBuffer = display_ns.DisplayBuffer
|
||||
DisplayBufferRef = DisplayBuffer.operator('ref')
|
||||
|
||||
DISPLAY_ROTATIONS = {
|
||||
0: display_ns.DISPLAY_ROTATION_0_DEGREES,
|
||||
90: display_ns.DISPLAY_ROTATION_90_DEGREES,
|
||||
180: display_ns.DISPLAY_ROTATION_180_DEGREES,
|
||||
270: display_ns.DISPLAY_ROTATION_270_DEGREES,
|
||||
}
|
||||
|
||||
|
||||
def validate_rotation(value):
|
||||
value = cv.string(value)
|
||||
if value.endswith(u"°"):
|
||||
value = value[:-1]
|
||||
try:
|
||||
value = int(value)
|
||||
except ValueError:
|
||||
raise vol.Invalid(u"Expected integer for rotation")
|
||||
return cv.one_of(*DISPLAY_ROTATIONS)(value)
|
||||
|
||||
|
||||
BASIC_DISPLAY_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
|
||||
vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
|
||||
vol.Optional(CONF_LAMBDA): cv.lambda_,
|
||||
})
|
||||
|
||||
FULL_DISPLAY_PLATFORM_SCHEMA = BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
vol.Optional(CONF_ROTATION): validate_rotation,
|
||||
})
|
||||
|
||||
|
||||
def setup_display_core_(display_var, config):
|
||||
if CONF_UPDATE_INTERVAL in config:
|
||||
add(display_var.set_update_interval(config[CONF_UPDATE_INTERVAL]))
|
||||
if CONF_ROTATION in config:
|
||||
add(display_var.set_rotation(DISPLAY_ROTATIONS[config[CONF_ROTATION]]))
|
||||
|
||||
|
||||
def setup_display(display_var, config):
|
||||
add_job(setup_display_core_, display_var, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_DISPLAY'
|
||||
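validate_rotation above accepts values like 90 or "90°", strips a trailing degree sign, and then checks the integer against the four known rotations. A standalone sketch of the same normalisation without the voluptuous/cv machinery; the names below are made up for the example and are not part of the diff:

    ALLOWED_ROTATIONS = (0, 90, 180, 270)

    def normalize_rotation(value):
        # 'value' is the raw string from the YAML file, e.g. "90" or u"90°".
        if value.endswith(u"°"):
            value = value[:-1]
        value = int(value)                # raises ValueError for junk input
        if value not in ALLOWED_ROTATIONS:
            raise ValueError("rotation must be one of {}".format(ALLOWED_ROTATIONS))
        return value

    print(normalize_rotation(u"270°"))    # -> 270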
72  esphomeyaml/components/display/lcd_gpio.py  Normal file
@@ -0,0 +1,72 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.const import CONF_DATA_PINS, CONF_DIMENSIONS, CONF_ENABLE_PIN, CONF_ID, \
|
||||
CONF_LAMBDA, CONF_RS_PIN, CONF_RW_PIN
|
||||
from esphomeyaml.helpers import App, Pvariable, add, gpio_output_pin_expression, process_lambda
|
||||
|
||||
GPIOLCDDisplay = display.display_ns.GPIOLCDDisplay
|
||||
LCDDisplay = display.display_ns.LCDDisplay
|
||||
LCDDisplayRef = LCDDisplay.operator('ref')
|
||||
|
||||
|
||||
def validate_lcd_dimensions(value):
|
||||
value = cv.dimensions(value)
|
||||
if value[0] > 0x40:
|
||||
raise vol.Invalid("LCD displays can't have more than 64 columns")
|
||||
if value[1] > 4:
|
||||
raise vol.Invalid("LCD displays can't have more than 4 rows")
|
||||
return value
|
||||
|
||||
|
||||
def validate_pin_length(value):
|
||||
if len(value) != 4 and len(value) != 8:
|
||||
raise vol.Invalid("LCD Displays can either operate in 4-pin or 8-pin mode,"
|
||||
"not {}-pin mode".format(len(value)))
|
||||
return value
|
||||
|
||||
|
||||
PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(GPIOLCDDisplay),
|
||||
vol.Required(CONF_DIMENSIONS): validate_lcd_dimensions,
|
||||
|
||||
vol.Required(CONF_DATA_PINS): vol.All([pins.gpio_output_pin_schema], validate_pin_length),
|
||||
vol.Required(CONF_ENABLE_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Required(CONF_RS_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Optional(CONF_RW_PIN): pins.gpio_output_pin_schema,
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
rhs = App.make_gpio_lcd_display(config[CONF_DIMENSIONS][0], config[CONF_DIMENSIONS][1])
|
||||
lcd = Pvariable(config[CONF_ID], rhs)
|
||||
pins_ = []
|
||||
for conf in config[CONF_DATA_PINS]:
|
||||
for pin in gpio_output_pin_expression(conf):
|
||||
yield
|
||||
pins_.append(pin)
|
||||
add(lcd.set_data_pins(*pins_))
|
||||
for enable in gpio_output_pin_expression(config[CONF_ENABLE_PIN]):
|
||||
yield
|
||||
add(lcd.set_enable_pin(enable))
|
||||
|
||||
for rs in gpio_output_pin_expression(config[CONF_RS_PIN]):
|
||||
yield
|
||||
add(lcd.set_rs_pin(rs))
|
||||
|
||||
if CONF_RW_PIN in config:
|
||||
for rw in gpio_output_pin_expression(config[CONF_RW_PIN]):
|
||||
yield
|
||||
add(lcd.set_rw_pin(rw))
|
||||
|
||||
if CONF_LAMBDA in config:
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')]):
|
||||
yield
|
||||
add(lcd.set_writer(lambda_))
|
||||
|
||||
display.setup_display(lcd, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_LCD_DISPLAY'
|
||||
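validate_pin_length above only accepts exactly four or eight data pins, matching the 4-bit and 8-bit bus modes of the character LCDs this platform drives. The same check as plain Python, without voluptuous; this is an illustrative sketch, not code from the diff:

    def check_data_pin_count(data_pins):
        if len(data_pins) not in (4, 8):
            raise ValueError("expected 4 or 8 data pins, got {}".format(len(data_pins)))
        return data_pins

    print(check_data_pin_count(["D0", "D1", "D2", "D3"]))   # accepted as 4-pin mode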
35  esphomeyaml/components/display/lcd_pcf8574.py  Normal file
@@ -0,0 +1,35 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.components.display.lcd_gpio import LCDDisplayRef, validate_lcd_dimensions
|
||||
from esphomeyaml.const import CONF_ADDRESS, CONF_DIMENSIONS, CONF_ID, CONF_LAMBDA
|
||||
from esphomeyaml.helpers import App, Pvariable, add, process_lambda
|
||||
|
||||
DEPENDENCIES = ['i2c']
|
||||
|
||||
PCF8574LCDDisplay = display.display_ns.PCF8574LCDDisplay
|
||||
|
||||
PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(PCF8574LCDDisplay),
|
||||
vol.Required(CONF_DIMENSIONS): validate_lcd_dimensions,
|
||||
vol.Optional(CONF_ADDRESS): cv.i2c_address,
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
rhs = App.make_pcf8574_lcd_display(config[CONF_DIMENSIONS][0], config[CONF_DIMENSIONS][1])
|
||||
lcd = Pvariable(config[CONF_ID], rhs)
|
||||
|
||||
if CONF_ADDRESS in config:
|
||||
add(lcd.set_address(config[CONF_ADDRESS]))
|
||||
|
||||
if CONF_LAMBDA in config:
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(LCDDisplayRef, 'it')]):
|
||||
yield
|
||||
add(lcd.set_writer(lambda_))
|
||||
|
||||
display.setup_display(lcd, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = ['-DUSE_LCD_DISPLAY', '-DUSE_LCD_DISPLAY_PCF8574']
|
||||
48  esphomeyaml/components/display/max7219.py  Normal file
@@ -0,0 +1,48 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.components.spi import SPIComponent
|
||||
from esphomeyaml.const import CONF_CS_PIN, CONF_ID, CONF_INTENSITY, CONF_LAMBDA, CONF_NUM_CHIPS, \
|
||||
CONF_SPI_ID
|
||||
from esphomeyaml.helpers import App, Pvariable, add, get_variable, gpio_output_pin_expression, \
|
||||
process_lambda
|
||||
|
||||
DEPENDENCIES = ['spi']
|
||||
|
||||
MAX7219Component = display.display_ns.MAX7219Component
|
||||
MAX7219ComponentRef = MAX7219Component.operator('ref')
|
||||
|
||||
PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(MAX7219Component),
|
||||
cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
|
||||
vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
|
||||
|
||||
vol.Optional(CONF_NUM_CHIPS): vol.All(cv.uint8_t, vol.Range(min=1)),
|
||||
vol.Optional(CONF_INTENSITY): vol.All(cv.uint8_t, vol.Range(min=0, max=15)),
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for spi in get_variable(config[CONF_SPI_ID]):
|
||||
yield
|
||||
for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
|
||||
yield
|
||||
rhs = App.make_max7219(spi, cs)
|
||||
max7219 = Pvariable(config[CONF_ID], rhs)
|
||||
|
||||
if CONF_NUM_CHIPS in config:
|
||||
add(max7219.set_num_chips(config[CONF_NUM_CHIPS]))
|
||||
if CONF_INTENSITY in config:
|
||||
add(max7219.set_intensity(config[CONF_INTENSITY]))
|
||||
|
||||
if CONF_LAMBDA in config:
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(MAX7219ComponentRef, 'it')]):
|
||||
yield
|
||||
add(max7219.set_writer(lambda_))
|
||||
|
||||
display.setup_display(max7219, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_MAX7219'
|
||||
32  esphomeyaml/components/display/nextion.py  Normal file
@@ -0,0 +1,32 @@
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.components.uart import UARTComponent
|
||||
from esphomeyaml.const import CONF_ID, CONF_LAMBDA, CONF_UART_ID
|
||||
from esphomeyaml.helpers import App, Pvariable, add, get_variable, process_lambda
|
||||
|
||||
DEPENDENCIES = ['uart']
|
||||
|
||||
Nextion = display.display_ns.Nextion
|
||||
NextionRef = Nextion.operator('ref')
|
||||
|
||||
PLATFORM_SCHEMA = display.BASIC_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(Nextion),
|
||||
cv.GenerateID(CONF_UART_ID): cv.use_variable_id(UARTComponent),
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for uart in get_variable(config[CONF_UART_ID]):
|
||||
yield
|
||||
rhs = App.make_nextion(uart)
|
||||
nextion = Pvariable(config[CONF_ID], rhs)
|
||||
|
||||
if CONF_LAMBDA in config:
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(NextionRef, 'it')]):
|
||||
yield
|
||||
add(nextion.set_writer(lambda_))
|
||||
|
||||
display.setup_display(nextion, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_NEXTION'
|
||||
46  esphomeyaml/components/display/ssd1306_i2c.py  Normal file
@@ -0,0 +1,46 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.components.display import ssd1306_spi
|
||||
from esphomeyaml.const import CONF_ADDRESS, CONF_EXTERNAL_VCC, CONF_ID, \
|
||||
CONF_MODEL, CONF_RESET_PIN, CONF_LAMBDA
|
||||
from esphomeyaml.helpers import App, Pvariable, add, \
|
||||
gpio_output_pin_expression, process_lambda
|
||||
|
||||
DEPENDENCIES = ['i2c']
|
||||
|
||||
I2CSSD1306 = display.display_ns.I2CSSD1306
|
||||
|
||||
PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(I2CSSD1306),
|
||||
vol.Required(CONF_MODEL): ssd1306_spi.SSD1306_MODEL,
|
||||
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Optional(CONF_EXTERNAL_VCC): cv.boolean,
|
||||
vol.Optional(CONF_ADDRESS): cv.i2c_address,
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
ssd = Pvariable(config[CONF_ID], App.make_i2c_ssd1306())
|
||||
add(ssd.set_model(ssd1306_spi.MODELS[config[CONF_MODEL]]))
|
||||
|
||||
if CONF_RESET_PIN in config:
|
||||
for reset in gpio_output_pin_expression(config[CONF_RESET_PIN]):
|
||||
yield
|
||||
add(ssd.set_reset_pin(reset))
|
||||
if CONF_EXTERNAL_VCC in config:
|
||||
add(ssd.set_external_vcc(config[CONF_EXTERNAL_VCC]))
|
||||
if CONF_ADDRESS in config:
|
||||
add(ssd.set_address(config[CONF_ADDRESS]))
|
||||
if CONF_LAMBDA in config:
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA],
|
||||
[(display.DisplayBufferRef, 'it')]):
|
||||
yield
|
||||
add(ssd.set_writer(lambda_))
|
||||
|
||||
display.setup_display(ssd, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_SSD1306'
|
||||
68  esphomeyaml/components/display/ssd1306_spi.py  Normal file
@@ -0,0 +1,68 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.components.spi import SPIComponent
|
||||
from esphomeyaml.const import CONF_CS_PIN, CONF_DC_PIN, CONF_EXTERNAL_VCC, \
|
||||
CONF_ID, CONF_MODEL, \
|
||||
CONF_RESET_PIN, CONF_SPI_ID, CONF_LAMBDA
|
||||
from esphomeyaml.helpers import App, Pvariable, add, get_variable, \
|
||||
gpio_output_pin_expression, process_lambda
|
||||
|
||||
DEPENDENCIES = ['spi']
|
||||
|
||||
SPISSD1306 = display.display_ns.SPISSD1306
|
||||
|
||||
MODELS = {
|
||||
'SSD1306_128X32': display.display_ns.SSD1306_MODEL_128_32,
|
||||
'SSD1306_128X64': display.display_ns.SSD1306_MODEL_128_64,
|
||||
'SSD1306_96X16': display.display_ns.SSD1306_MODEL_96_16,
|
||||
'SSD1306_64X48': display.display_ns.SSD1306_MODEL_64_48,
|
||||
'SH1106_128X32': display.display_ns.SH1106_MODEL_128_32,
|
||||
'SH1106_128X64': display.display_ns.SH1106_MODEL_128_64,
|
||||
'SH1106_96X16': display.display_ns.SH1106_MODEL_96_16,
|
||||
'SH1106_64X48': display.display_ns.SH1106_MODEL_64_48,
|
||||
}
|
||||
|
||||
SSD1306_MODEL = vol.All(vol.Upper, vol.Replace(' ', '_'), cv.one_of(*MODELS))
|
||||
|
||||
PLATFORM_SCHEMA = display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(SPISSD1306),
|
||||
cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
|
||||
vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Required(CONF_DC_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Required(CONF_MODEL): SSD1306_MODEL,
|
||||
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Optional(CONF_EXTERNAL_VCC): cv.boolean,
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for spi in get_variable(config[CONF_SPI_ID]):
|
||||
yield
|
||||
for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
|
||||
yield
|
||||
for dc in gpio_output_pin_expression(config[CONF_DC_PIN]):
|
||||
yield
|
||||
|
||||
rhs = App.make_spi_ssd1306(spi, cs, dc)
|
||||
ssd = Pvariable(config[CONF_ID], rhs)
|
||||
add(ssd.set_model(MODELS[config[CONF_MODEL]]))
|
||||
|
||||
if CONF_RESET_PIN in config:
|
||||
for reset in gpio_output_pin_expression(config[CONF_RESET_PIN]):
|
||||
yield
|
||||
add(ssd.set_reset_pin(reset))
|
||||
if CONF_EXTERNAL_VCC in config:
|
||||
add(ssd.set_external_vcc(config[CONF_EXTERNAL_VCC]))
|
||||
if CONF_LAMBDA in config:
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA],
|
||||
[(display.DisplayBufferRef, 'it')]):
|
||||
yield
|
||||
add(ssd.set_writer(lambda_))
|
||||
|
||||
display.setup_display(ssd, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_SSD1306'
|
||||
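SSD1306_MODEL above uppercases the configured model name and replaces spaces with underscores before looking it up in MODELS, so "sh1106 128x64" and "SH1106_128X64" resolve to the same entry. A minimal standalone version of that normalisation; the key list is abbreviated for the example and not taken from the diff:

    MODEL_KEYS = ("SSD1306_128X64", "SH1106_128X64")

    def normalize_model(value):
        value = value.upper().replace(" ", "_")
        if value not in MODEL_KEYS:
            raise ValueError("unknown model: {}".format(value))
        return value

    print(normalize_model("sh1106 128x64"))   # -> SH1106_128X64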
84  esphomeyaml/components/display/waveshare_epaper.py  Normal file
@@ -0,0 +1,84 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.components.spi import SPIComponent
|
||||
from esphomeyaml.const import CONF_BUSY_PIN, CONF_CS_PIN, CONF_DC_PIN, CONF_FULL_UPDATE_EVERY, \
|
||||
CONF_ID, CONF_LAMBDA, CONF_MODEL, CONF_RESET_PIN, CONF_SPI_ID
|
||||
from esphomeyaml.helpers import App, Pvariable, add, get_variable, gpio_input_pin_expression, \
|
||||
gpio_output_pin_expression, process_lambda
|
||||
|
||||
DEPENDENCIES = ['spi']
|
||||
|
||||
WaveshareEPaperTypeA = display.display_ns.WaveshareEPaperTypeA
|
||||
WaveshareEPaper = display.display_ns.WaveshareEPaper
|
||||
|
||||
MODELS = {
|
||||
'1.54in': ('a', display.display_ns.WAVESHARE_EPAPER_1_54_IN),
|
||||
'2.13in': ('a', display.display_ns.WAVESHARE_EPAPER_2_13_IN),
|
||||
'2.90in': ('a', display.display_ns.WAVESHARE_EPAPER_2_9_IN),
|
||||
'2.70in': ('b', display.display_ns.WAVESHARE_EPAPER_2_7_IN),
|
||||
'4.20in': ('b', display.display_ns.WAVESHARE_EPAPER_4_2_IN),
|
||||
'7.50in': ('b', display.display_ns.WAVESHARE_EPAPER_7_5_IN),
|
||||
}
|
||||
|
||||
|
||||
def validate_full_update_every_only_type_a(value):
|
||||
if CONF_FULL_UPDATE_EVERY not in value:
|
||||
return value
|
||||
if MODELS[value[CONF_MODEL]][0] != 'a':
|
||||
raise vol.Invalid("The 'full_update_every' option is only available for models "
|
||||
"'1.54in', '2.13in' and '2.90in'.")
|
||||
return value
|
||||
|
||||
|
||||
PLATFORM_SCHEMA = vol.All(display.FULL_DISPLAY_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(None),
|
||||
cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
|
||||
vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Required(CONF_DC_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Required(CONF_MODEL): vol.All(vol.Lower, cv.one_of(*MODELS)),
|
||||
vol.Optional(CONF_RESET_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Optional(CONF_BUSY_PIN): pins.gpio_input_pin_schema,
|
||||
vol.Optional(CONF_FULL_UPDATE_EVERY): cv.uint32_t,
|
||||
}), validate_full_update_every_only_type_a)
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for spi in get_variable(config[CONF_SPI_ID]):
|
||||
yield
|
||||
for cs in gpio_output_pin_expression(config[CONF_CS_PIN]):
|
||||
yield
|
||||
for dc in gpio_output_pin_expression(config[CONF_DC_PIN]):
|
||||
yield
|
||||
|
||||
model_type, model = MODELS[config[CONF_MODEL]]
|
||||
if model_type == 'a':
|
||||
rhs = App.make_waveshare_epaper_type_a(spi, cs, dc, model)
|
||||
epaper = Pvariable(config[CONF_ID], rhs, type=WaveshareEPaperTypeA)
|
||||
elif model_type == 'b':
|
||||
rhs = App.make_waveshare_epaper_type_b(spi, cs, dc, model)
|
||||
epaper = Pvariable(config[CONF_ID], rhs, type=WaveshareEPaper)
|
||||
else:
|
||||
raise NotImplementedError()
|
||||
|
||||
if CONF_LAMBDA in config:
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], [(display.DisplayBufferRef, 'it')]):
|
||||
yield
|
||||
add(epaper.set_writer(lambda_))
|
||||
if CONF_RESET_PIN in config:
|
||||
for reset in gpio_output_pin_expression(config[CONF_RESET_PIN]):
|
||||
yield
|
||||
add(epaper.set_reset_pin(reset))
|
||||
if CONF_BUSY_PIN in config:
|
||||
for reset in gpio_input_pin_expression(config[CONF_BUSY_PIN]):
|
||||
yield
|
||||
add(epaper.set_busy_pin(reset))
|
||||
if CONF_FULL_UPDATE_EVERY in config:
|
||||
add(epaper.set_full_update_every(config[CONF_FULL_UPDATE_EVERY]))
|
||||
|
||||
display.setup_display(epaper, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_WAVESHARE_EPAPER'
|
||||
@@ -1,24 +1,5 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml import config_validation as cv
|
||||
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32
|
||||
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns
|
||||
|
||||
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||
|
||||
ESP32BLETracker = esphomelib_ns.ESP32BLETracker
|
||||
|
||||
CONFIG_SCHEMA = vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(ESP32BLETracker),
|
||||
vol.Optional(CONF_SCAN_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
rhs = App.make_esp32_ble_tracker()
|
||||
ble = Pvariable(config[CONF_ID], rhs)
|
||||
if CONF_SCAN_INTERVAL in config:
|
||||
add(ble.set_scan_interval(config[CONF_SCAN_INTERVAL]))
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_ESP32_BLE_TRACKER'
|
||||
CONFIG_SCHEMA = cv.invalid("The 'esp32_ble' component has been renamed to the 'esp32_ble_tracker' "
|
||||
"component in order to avoid confusion with the new 'esp32_ble_beacon' "
|
||||
"component.")
|
||||
|
||||
35  esphomeyaml/components/esp32_ble_beacon.py  Normal file
@@ -0,0 +1,35 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml import config_validation as cv
|
||||
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32, CONF_UUID, CONF_TYPE
|
||||
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, RawExpression, ArrayInitializer
|
||||
|
||||
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||
|
||||
ESP32BLEBeacon = esphomelib_ns.ESP32BLEBeacon
|
||||
|
||||
CONF_MAJOR = 'major'
|
||||
CONF_MINOR = 'minor'
|
||||
|
||||
CONFIG_SCHEMA = vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(ESP32BLEBeacon),
|
||||
vol.Required(CONF_TYPE): vol.All(vol.Upper, cv.one_of('IBEACON')),
|
||||
vol.Required(CONF_UUID): cv.uuid,
|
||||
vol.Optional(CONF_MAJOR): cv.uint16_t,
|
||||
vol.Optional(CONF_MINOR): cv.uint16_t,
|
||||
vol.Optional(CONF_SCAN_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
uuid = config[CONF_UUID].hex
|
||||
uuid_arr = [RawExpression('0x{}'.format(uuid[i:i+2])) for i in range(0, len(uuid), 2)]
|
||||
rhs = App.make_esp32_ble_beacon(ArrayInitializer(*uuid_arr, multiline=False))
|
||||
ble = Pvariable(config[CONF_ID], rhs)
|
||||
if CONF_MAJOR in config:
|
||||
add(ble.set_major(config[CONF_MAJOR]))
|
||||
if CONF_MINOR in config:
|
||||
add(ble.set_minor(config[CONF_MINOR]))
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_ESP32_BLE_BEACON'
|
||||
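make_esp32_ble_beacon above receives the iBeacon UUID as sixteen single-byte expressions, produced by slicing the UUID's 32-character hex form two characters at a time. The same slicing in plain Python, using only the standard uuid module; the UUID value is just an example, not from the diff:

    import uuid

    value = uuid.UUID("c29ce823-e67a-4e71-bff2-abaa32e77a98")
    hex_str = value.hex                                   # 32 hex characters
    byte_strs = ["0x" + hex_str[i:i + 2] for i in range(0, len(hex_str), 2)]
    print(len(byte_strs))     # -> 16
    print(byte_strs[:4])      # -> ['0xc2', '0x9c', '0xe8', '0x23']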
31  esphomeyaml/components/esp32_ble_tracker.py  Normal file
@@ -0,0 +1,31 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml import config_validation as cv
|
||||
from esphomeyaml.const import CONF_ID, CONF_SCAN_INTERVAL, ESP_PLATFORM_ESP32
|
||||
from esphomeyaml.core import HexInt
|
||||
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, ArrayInitializer
|
||||
|
||||
ESP_PLATFORMS = [ESP_PLATFORM_ESP32]
|
||||
|
||||
CONF_ESP32_BLE_ID = 'esp32_ble_id'
|
||||
ESP32BLETracker = esphomelib_ns.ESP32BLETracker
|
||||
|
||||
CONFIG_SCHEMA = vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(ESP32BLETracker),
|
||||
vol.Optional(CONF_SCAN_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
})
|
||||
|
||||
|
||||
def make_address_array(address):
|
||||
addr = [HexInt(i) for i in address.parts]
|
||||
return ArrayInitializer(*addr, multiline=False)
|
||||
|
||||
|
||||
def to_code(config):
|
||||
rhs = App.make_esp32_ble_tracker()
|
||||
ble = Pvariable(config[CONF_ID], rhs)
|
||||
if CONF_SCAN_INTERVAL in config:
|
||||
add(ble.set_scan_interval(config[CONF_SCAN_INTERVAL]))
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_ESP32_BLE_TRACKER'
|
||||
@@ -41,7 +41,10 @@ VOLTAGE_ATTENUATION = {
|
||||
'0V': global_ns.TOUCH_HVOLT_ATTEN_0V,
|
||||
}
|
||||
|
||||
ESP32TouchComponent = binary_sensor.binary_sensor_ns.ESP32TouchComponent
|
||||
|
||||
CONFIG_SCHEMA = vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(ESP32TouchComponent),
|
||||
vol.Optional(CONF_SETUP_MODE): cv.boolean,
|
||||
vol.Optional(CONF_IIR_FILTER): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_SLEEP_DURATION):
|
||||
@@ -53,8 +56,6 @@ CONFIG_SCHEMA = vol.Schema({
|
||||
vol.Optional(CONF_VOLTAGE_ATTENUATION): validate_voltage(VOLTAGE_ATTENUATION),
|
||||
})
|
||||
|
||||
ESP32TouchComponent = binary_sensor.binary_sensor_ns.ESP32TouchComponent
|
||||
|
||||
|
||||
def to_code(config):
|
||||
rhs = App.make_esp32_touch_component()
|
||||
@@ -64,7 +65,7 @@ def to_code(config):
|
||||
if CONF_IIR_FILTER in config:
|
||||
add(touch.set_iir_filter(config[CONF_IIR_FILTER]))
|
||||
if CONF_SLEEP_DURATION in config:
|
||||
sleep_duration = int(config[CONF_SLEEP_DURATION].total_microseconds * 6.6667)
|
||||
sleep_duration = int(config[CONF_SLEEP_DURATION].total_microseconds * 0.15)
|
||||
add(touch.set_sleep_duration(sleep_duration))
|
||||
if CONF_MEASUREMENT_DURATION in config:
|
||||
measurement_duration = int(config[CONF_MEASUREMENT_DURATION].total_microseconds * 0.125)
|
||||
|
||||
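The sleep_duration change above (factor 6.6667 replaced by 0.15) converts the configured time from microseconds into ticks of the touch peripheral's slow clock; assuming that clock runs at roughly 150 kHz (an assumption, not stated in the diff), there are 0.15 ticks per microsecond, while 6.6667 is the inverse (microseconds per tick), so the old factor scaled in the wrong direction. A one-line illustration of the corrected conversion:

    sleep_duration_us = 27 * 1000          # example value: 27 ms in microseconds
    ticks = int(sleep_duration_us * 0.15)  # 0.15 ticks per microsecond at ~150 kHz
    print(ticks)                           # -> 4050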
@@ -1,9 +1,12 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_ID, CONF_MQTT_ID, CONF_OSCILLATION_COMMAND_TOPIC, \
|
||||
CONF_OSCILLATION_STATE_TOPIC, CONF_SPEED_COMMAND_TOPIC, CONF_SPEED_STATE_TOPIC
|
||||
from esphomeyaml.helpers import Application, Pvariable, add, esphomelib_ns, setup_mqtt_component
|
||||
CONF_OSCILLATION_STATE_TOPIC, CONF_SPEED_COMMAND_TOPIC, CONF_SPEED_STATE_TOPIC, CONF_INTERNAL, \
|
||||
CONF_SPEED, CONF_OSCILLATING
|
||||
from esphomeyaml.helpers import Application, Pvariable, add, esphomelib_ns, setup_mqtt_component, \
|
||||
TemplateArguments, get_variable, templatable, bool_
|
||||
|
||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||
|
||||
@@ -29,6 +32,8 @@ FAN_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
|
||||
vol.Optional(CONF_OSCILLATION_COMMAND_TOPIC): cv.subscribe_topic,
|
||||
})
|
||||
|
||||
FAN_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(FAN_SCHEMA.schema)
|
||||
|
||||
|
||||
FAN_SPEEDS = {
|
||||
'OFF': FAN_SPEED_OFF,
|
||||
@@ -43,6 +48,9 @@ def validate_fan_speed(value):
|
||||
|
||||
|
||||
def setup_fan_core_(fan_var, mqtt_var, config):
|
||||
if CONF_INTERNAL in config:
|
||||
add(fan_var.set_internal(config[CONF_INTERNAL]))
|
||||
|
||||
if CONF_OSCILLATION_STATE_TOPIC in config:
|
||||
add(mqtt_var.set_custom_oscillation_state_topic(config[CONF_OSCILLATION_STATE_TOPIC]))
|
||||
if CONF_OSCILLATION_COMMAND_TOPIC in config:
|
||||
@@ -61,3 +69,62 @@ def setup_fan(fan_obj, mqtt_obj, config):
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_FAN'
|
||||
|
||||
|
||||
CONF_FAN_TOGGLE = 'fan.toggle'
|
||||
FAN_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_FAN_TOGGLE, FAN_TOGGLE_ACTION_SCHEMA)
|
||||
def fan_toggle_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_toggle_action(template_arg)
|
||||
type = ToggleAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_FAN_TURN_OFF = 'fan.turn_off'
|
||||
FAN_TURN_OFF_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_FAN_TURN_OFF, FAN_TURN_OFF_ACTION_SCHEMA)
|
||||
def fan_turn_off_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_off_action(template_arg)
|
||||
type = TurnOffAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_FAN_TURN_ON = 'fan.turn_on'
|
||||
FAN_TURN_ON_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_OSCILLATING): cv.templatable(cv.boolean),
|
||||
vol.Optional(CONF_SPEED): cv.templatable(validate_fan_speed),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_FAN_TURN_ON, FAN_TURN_ON_ACTION_SCHEMA)
|
||||
def fan_turn_on_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_on_action(template_arg)
|
||||
type = TurnOnAction.template(arg_type)
|
||||
action = Pvariable(action_id, rhs, type=type)
|
||||
if CONF_OSCILLATING in config:
|
||||
for template_ in templatable(config[CONF_OSCILLATING], arg_type, bool_):
|
||||
yield None
|
||||
add(action.set_oscillating(template_))
|
||||
if CONF_SPEED in config:
|
||||
for template_ in templatable(config[CONF_SPEED], arg_type, FanSpeed):
|
||||
yield None
|
||||
add(action.set_speed(template_))
|
||||
yield action
|
||||
|
||||
@@ -5,11 +5,11 @@ from esphomeyaml.components import fan
|
||||
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_OSCILLATION_OUTPUT, CONF_OUTPUT
|
||||
from esphomeyaml.helpers import App, add, get_variable, variable
|
||||
|
||||
PLATFORM_SCHEMA = fan.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
|
||||
vol.Required(CONF_OUTPUT): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_OSCILLATION_OUTPUT): cv.use_variable_id(None),
|
||||
}).extend(fan.FAN_SCHEMA.schema)
|
||||
}))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
@@ -7,7 +7,7 @@ from esphomeyaml.const import CONF_HIGH, CONF_LOW, CONF_MAKE_ID, CONF_MEDIUM, CO
|
||||
CONF_SPEED_STATE_TOPIC
|
||||
from esphomeyaml.helpers import App, add, get_variable, variable
|
||||
|
||||
PLATFORM_SCHEMA = fan.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(fan.FAN_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(fan.MakeFan),
|
||||
vol.Required(CONF_OUTPUT): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_SPEED_STATE_TOPIC): cv.publish_topic,
|
||||
@@ -18,7 +18,7 @@ PLATFORM_SCHEMA = fan.PLATFORM_SCHEMA.extend({
|
||||
vol.Required(CONF_MEDIUM): cv.percentage,
|
||||
vol.Required(CONF_HIGH): cv.percentage,
|
||||
}),
|
||||
}).extend(fan.FAN_SCHEMA.schema)
|
||||
}))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
@@ -29,7 +29,7 @@ def to_code(config):
|
||||
fan_struct = variable(config[CONF_MAKE_ID], rhs)
|
||||
if CONF_SPEED in config:
|
||||
speeds = config[CONF_SPEED]
|
||||
add(fan_struct.Poutput.set_speed(output, 0.0,
|
||||
add(fan_struct.Poutput.set_speed(output,
|
||||
speeds[CONF_LOW],
|
||||
speeds[CONF_MEDIUM],
|
||||
speeds[CONF_HIGH]))
|
||||
|
||||
121  esphomeyaml/components/font.py  Normal file
@@ -0,0 +1,121 @@
|
||||
# coding=utf-8
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import core
|
||||
from esphomeyaml.components import display
|
||||
from esphomeyaml.const import CONF_FILE, CONF_GLYPHS, CONF_ID, CONF_SIZE
|
||||
from esphomeyaml.core import HexInt
|
||||
from esphomeyaml.helpers import App, ArrayInitializer, MockObj, Pvariable, RawExpression, add, \
|
||||
relative_path
|
||||
|
||||
DEPENDENCIES = ['display']
|
||||
|
||||
Font = display.display_ns.Font
|
||||
Glyph = display.display_ns.Glyph
|
||||
|
||||
|
||||
def validate_glyphs(value):
|
||||
if isinstance(value, list):
|
||||
value = vol.Schema([cv.string])(value)
|
||||
value = vol.Schema([cv.string])(list(value))
|
||||
|
||||
def comparator(x, y):
|
||||
x_ = x.encode('utf-8')
|
||||
y_ = y.encode('utf-8')
|
||||
|
||||
for c in range(min(len(x_), len(y_))):
|
||||
if x_[c] < y_[c]:
|
||||
return -1
|
||||
if x_[c] > y_[c]:
|
||||
return 1
|
||||
|
||||
if len(x_) < len(y_):
|
||||
return -1
|
||||
elif len(x_) > len(y_):
|
||||
return 1
|
||||
else:
|
||||
raise vol.Invalid(u"Found duplicate glyph {}".format(x))
|
||||
|
||||
value.sort(cmp=comparator)
|
||||
return value
|
||||
|
||||
|
||||
def validate_pillow_installed(value):
|
||||
try:
|
||||
import PIL
|
||||
except ImportError:
|
||||
raise vol.Invalid("Please install the pillow python package to use this feature. "
|
||||
"(pip2 install pillow)")
|
||||
|
||||
if PIL.__version__[0] < '4':
|
||||
raise vol.Invalid("Please update your pillow installation to at least 4.0.x. "
|
||||
"(pip2 install -U pillow)")
|
||||
|
||||
return value
|
||||
|
||||
|
||||
def validate_truetype_file(value):
|
||||
if value.endswith('.zip'): # for Google Fonts downloads
|
||||
raise vol.Invalid(u"Please unzip the font archive '{}' first and then use the .ttf files "
|
||||
u"inside.".format(value))
|
||||
if not value.endswith('.ttf'):
|
||||
raise vol.Invalid(u"Only truetype (.ttf) files are supported. Please make sure you're "
|
||||
u"using the correct format or rename the extension to .ttf")
|
||||
return cv.file_(value)
|
||||
|
||||
|
||||
DEFAULT_GLYPHS = u' !"%()+,-.:0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ_abcdefghijklmnopqrstuvwxyz°'
|
||||
CONF_RAW_DATA_ID = 'raw_data_id'
|
||||
|
||||
FONT_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(Font),
|
||||
vol.Required(CONF_FILE): validate_truetype_file,
|
||||
vol.Optional(CONF_GLYPHS, default=DEFAULT_GLYPHS): validate_glyphs,
|
||||
vol.Optional(CONF_SIZE, default=20): vol.All(cv.int_, vol.Range(min=1)),
|
||||
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
|
||||
})
|
||||
|
||||
CONFIG_SCHEMA = vol.All(validate_pillow_installed, cv.ensure_list, [FONT_SCHEMA])
|
||||
|
||||
|
||||
def to_code(config):
|
||||
from PIL import ImageFont
|
||||
|
||||
for conf in config:
|
||||
path = relative_path(conf[CONF_FILE])
|
||||
try:
|
||||
font = ImageFont.truetype(path, conf[CONF_SIZE])
|
||||
except Exception as e:
|
||||
raise core.ESPHomeYAMLError(u"Could not load truetype file {}: {}".format(path, e))
|
||||
|
||||
ascent, descent = font.getmetrics()
|
||||
|
||||
glyph_args = {}
|
||||
data = []
|
||||
for glyph in conf[CONF_GLYPHS]:
|
||||
mask = font.getmask(glyph, mode='1')
|
||||
_, (offset_x, offset_y) = font.font.getsize(glyph)
|
||||
width, height = mask.size
|
||||
width8 = ((width + 7) // 8) * 8
|
||||
glyph_data = [0 for _ in range(height * width8 // 8)] # noqa: F812
|
||||
for y in range(height):
|
||||
for x in range(width):
|
||||
if not mask.getpixel((x, y)):
|
||||
continue
|
||||
pos = x + y * width8
|
||||
glyph_data[pos // 8] |= 0x80 >> (pos % 8)
|
||||
glyph_args[glyph] = (len(data), offset_x, offset_y, width, height)
|
||||
data += glyph_data
|
||||
|
||||
raw_data = MockObj(conf[CONF_RAW_DATA_ID])
|
||||
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
|
||||
raw_data, len(data),
|
||||
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
|
||||
|
||||
glyphs = []
|
||||
for glyph in conf[CONF_GLYPHS]:
|
||||
glyphs.append(Glyph(glyph, raw_data, *glyph_args[glyph]))
|
||||
|
||||
rhs = App.make_font(ArrayInitializer(*glyphs), ascent, ascent + descent)
|
||||
Pvariable(conf[CONF_ID], rhs)
|
||||
@@ -12,9 +12,12 @@ CONFIG_SCHEMA = vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(I2CComponent),
|
||||
vol.Required(CONF_SDA, default='SDA'): pins.input_output_pin,
|
||||
vol.Required(CONF_SCL, default='SCL'): pins.input_output_pin,
|
||||
vol.Optional(CONF_FREQUENCY): cv.positive_int,
|
||||
vol.Optional(CONF_RECEIVE_TIMEOUT): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_FREQUENCY): vol.All(cv.frequency, vol.Range(min=0, min_included=False)),
|
||||
vol.Optional(CONF_SCAN): cv.boolean,
|
||||
|
||||
vol.Optional(CONF_RECEIVE_TIMEOUT): cv.invalid("The receive_timeout option has been removed "
|
||||
"because timeouts are already handled by the "
|
||||
"low-level i2c interface.")
|
||||
})
|
||||
|
||||
|
||||
@@ -23,8 +26,6 @@ def to_code(config):
|
||||
i2c = Pvariable(config[CONF_ID], rhs)
|
||||
if CONF_FREQUENCY in config:
|
||||
add(i2c.set_frequency(config[CONF_FREQUENCY]))
|
||||
if CONF_RECEIVE_TIMEOUT in config:
|
||||
add(i2c.set_receive_timeout(config[CONF_RECEIVE_TIMEOUT]))
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_I2C'
|
||||
|
||||
65  esphomeyaml/components/image.py  Normal file
@@ -0,0 +1,65 @@
|
||||
# coding=utf-8
|
||||
import logging
|
||||
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import core
|
||||
from esphomeyaml.components import display, font
|
||||
from esphomeyaml.const import CONF_FILE, CONF_ID, CONF_RESIZE
|
||||
from esphomeyaml.core import HexInt
|
||||
from esphomeyaml.helpers import App, ArrayInitializer, MockObj, Pvariable, RawExpression, add, \
|
||||
relative_path
|
||||
|
||||
_LOGGER = logging.getLogger(__name__)
|
||||
|
||||
DEPENDENCIES = ['display']
|
||||
|
||||
Image_ = display.display_ns.Image
|
||||
|
||||
CONF_RAW_DATA_ID = 'raw_data_id'
|
||||
|
||||
IMAGE_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(Image_),
|
||||
vol.Required(CONF_FILE): cv.file_,
|
||||
vol.Optional(CONF_RESIZE): cv.dimensions,
|
||||
cv.GenerateID(CONF_RAW_DATA_ID): cv.declare_variable_id(None),
|
||||
})
|
||||
|
||||
CONFIG_SCHEMA = vol.All(font.validate_pillow_installed, cv.ensure_list, [IMAGE_SCHEMA])
|
||||
|
||||
|
||||
def to_code(config):
|
||||
from PIL import Image
|
||||
|
||||
for conf in config:
|
||||
path = relative_path(conf[CONF_FILE])
|
||||
try:
|
||||
image = Image.open(path)
|
||||
except Exception as e:
|
||||
raise core.ESPHomeYAMLError(u"Could not load image file {}: {}".format(path, e))
|
||||
|
||||
if CONF_RESIZE in conf:
|
||||
image.thumbnail(conf[CONF_RESIZE])
|
||||
|
||||
image = image.convert('1', dither=Image.NONE)
|
||||
width, height = image.size
|
||||
if width > 500 or height > 500:
|
||||
_LOGGER.warning("The image you requested is very big. Please consider using the resize "
|
||||
"parameter")
|
||||
width8 = ((width + 7) // 8) * 8
|
||||
data = [0 for _ in range(height * width8 // 8)]
|
||||
for y in range(height):
|
||||
for x in range(width):
|
||||
if image.getpixel((x, y)):
|
||||
continue
|
||||
pos = x + y * width8
|
||||
data[pos // 8] |= 0x80 >> (pos % 8)
|
||||
|
||||
raw_data = MockObj(conf[CONF_RAW_DATA_ID])
|
||||
add(RawExpression('static const uint8_t {}[{}] PROGMEM = {}'.format(
|
||||
raw_data, len(data),
|
||||
ArrayInitializer(*[HexInt(x) for x in data], multiline=False))))
|
||||
|
||||
rhs = App.make_image(raw_data, width, height)
|
||||
Pvariable(conf[CONF_ID], rhs)
|
||||
@@ -1,28 +1,6 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.components import switch
|
||||
from esphomeyaml.const import CONF_CARRIER_DUTY_PERCENT, CONF_ID, CONF_PIN
|
||||
from esphomeyaml.helpers import App, Pvariable, gpio_output_pin_expression
|
||||
|
||||
IRTransmitterComponent = switch.switch_ns.namespace('IRTransmitterComponent')
|
||||
|
||||
CONFIG_SCHEMA = vol.All(cv.ensure_list, [vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(IRTransmitterComponent),
|
||||
vol.Required(CONF_PIN): pins.GPIO_OUTPUT_PIN_SCHEMA,
|
||||
vol.Optional(CONF_CARRIER_DUTY_PERCENT): vol.All(vol.Coerce(int),
|
||||
vol.Range(min=1, max=100)),
|
||||
})])
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for conf in config:
|
||||
pin = None
|
||||
for pin in gpio_output_pin_expression(conf[CONF_PIN]):
|
||||
yield
|
||||
rhs = App.make_ir_transmitter(pin, conf.get(CONF_CARRIER_DUTY_PERCENT))
|
||||
Pvariable(conf[CONF_ID], rhs)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_IR_TRANSMITTER'
|
||||
def CONFIG_SCHEMA(config):
|
||||
raise vol.Invalid("The ir_transmitter component has been renamed to "
|
||||
"remote_transmitter because of 433MHz signal support.")
|
||||
|
||||
@@ -1,7 +1,16 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, CONF_ID, \
|
||||
CONF_MQTT_ID
|
||||
from esphomeyaml.helpers import Application, Pvariable, add, esphomelib_ns, setup_mqtt_component
|
||||
from esphomeyaml.const import CONF_ALPHA, CONF_BLUE, CONF_BRIGHTNESS, CONF_COLORS, \
|
||||
CONF_DEFAULT_TRANSITION_LENGTH, CONF_DURATION, CONF_EFFECTS, CONF_EFFECT_ID, \
|
||||
CONF_GAMMA_CORRECT, CONF_GREEN, CONF_ID, CONF_INTERNAL, CONF_LAMBDA, CONF_MQTT_ID, CONF_NAME, \
|
||||
CONF_NUM_LEDS, CONF_RANDOM, CONF_RED, CONF_SPEED, CONF_STATE, CONF_TRANSITION_LENGTH, \
|
||||
CONF_UPDATE_INTERVAL, CONF_WHITE, CONF_WIDTH, CONF_FLASH_LENGTH, CONF_COLOR_TEMPERATURE, \
|
||||
CONF_EFFECT
|
||||
from esphomeyaml.helpers import Application, ArrayInitializer, Pvariable, RawExpression, \
|
||||
StructInitializer, add, add_job, esphomelib_ns, process_lambda, setup_mqtt_component, \
|
||||
get_variable, TemplateArguments, templatable, uint32, float_, std_string
|
||||
|
||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||
|
||||
@@ -9,23 +18,317 @@ PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||
|
||||
light_ns = esphomelib_ns.namespace('light')
|
||||
LightState = light_ns.LightState
|
||||
LightColorValues = light_ns.LightColorValues
|
||||
MQTTJSONLightComponent = light_ns.MQTTJSONLightComponent
|
||||
ToggleAction = light_ns.ToggleAction
|
||||
TurnOffAction = light_ns.TurnOffAction
|
||||
TurnOnAction = light_ns.TurnOnAction
|
||||
MakeLight = Application.MakeLight
|
||||
RandomLightEffect = light_ns.RandomLightEffect
|
||||
LambdaLightEffect = light_ns.LambdaLightEffect
|
||||
StrobeLightEffect = light_ns.StrobeLightEffect
|
||||
StrobeLightEffectColor = light_ns.StrobeLightEffectColor
|
||||
FlickerLightEffect = light_ns.FlickerLightEffect
|
||||
FastLEDLambdaLightEffect = light_ns.FastLEDLambdaLightEffect
|
||||
FastLEDRainbowLightEffect = light_ns.FastLEDRainbowLightEffect
|
||||
FastLEDColorWipeEffect = light_ns.FastLEDColorWipeEffect
|
||||
FastLEDColorWipeEffectColor = light_ns.FastLEDColorWipeEffectColor
|
||||
FastLEDScanEffect = light_ns.FastLEDScanEffect
|
||||
FastLEDScanEffectColor = light_ns.FastLEDScanEffectColor
|
||||
FastLEDTwinkleEffect = light_ns.FastLEDTwinkleEffect
|
||||
FastLEDRandomTwinkleEffect = light_ns.FastLEDRandomTwinkleEffect
|
||||
FastLEDFireworksEffect = light_ns.FastLEDFireworksEffect
|
||||
FastLEDFlickerEffect = light_ns.FastLEDFlickerEffect
|
||||
FastLEDLightOutputComponent = light_ns.FastLEDLightOutputComponent
|
||||
|
||||
CONF_STROBE = 'strobe'
|
||||
CONF_FLICKER = 'flicker'
|
||||
CONF_FASTLED_LAMBDA = 'fastled_lambda'
|
||||
CONF_FASTLED_RAINBOW = 'fastled_rainbow'
|
||||
CONF_FASTLED_COLOR_WIPE = 'fastled_color_wipe'
|
||||
CONF_FASTLED_SCAN = 'fastled_scan'
|
||||
CONF_FASTLED_TWINKLE = 'fastled_twinkle'
|
||||
CONF_FASTLED_RANDOM_TWINKLE = 'fastled_random_twinkle'
|
||||
CONF_FASTLED_FIREWORKS = 'fastled_fireworks'
|
||||
CONF_FASTLED_FLICKER = 'fastled_flicker'
|
||||
|
||||
CONF_ADD_LED_INTERVAL = 'add_led_interval'
|
||||
CONF_REVERSE = 'reverse'
|
||||
CONF_MOVE_INTERVAL = 'move_interval'
|
||||
CONF_TWINKLE_PROBABILITY = 'twinkle_probability'
|
||||
CONF_PROGRESS_INTERVAL = 'progress_interval'
|
||||
CONF_SPARK_PROBABILITY = 'spark_probability'
|
||||
CONF_USE_RANDOM_COLOR = 'use_random_color'
|
||||
CONF_FADE_OUT_RATE = 'fade_out_rate'
|
||||
CONF_INTENSITY = 'intensity'
|
||||
|
||||
BINARY_EFFECTS = [CONF_LAMBDA, CONF_STROBE]
|
||||
MONOCHROMATIC_EFFECTS = BINARY_EFFECTS + [CONF_FLICKER]
|
||||
RGB_EFFECTS = MONOCHROMATIC_EFFECTS + [CONF_RANDOM]
|
||||
FASTLED_EFFECTS = RGB_EFFECTS + [CONF_FASTLED_LAMBDA, CONF_FASTLED_RAINBOW, CONF_FASTLED_COLOR_WIPE,
|
||||
CONF_FASTLED_SCAN, CONF_FASTLED_TWINKLE,
|
||||
CONF_FASTLED_RANDOM_TWINKLE, CONF_FASTLED_FIREWORKS,
|
||||
CONF_FASTLED_FLICKER]
|
||||
|
||||
EFFECTS_SCHEMA = vol.Schema({
|
||||
vol.Optional(CONF_LAMBDA): vol.Schema({
|
||||
vol.Required(CONF_NAME): cv.string,
|
||||
vol.Required(CONF_LAMBDA): cv.lambda_,
|
||||
vol.Optional(CONF_UPDATE_INTERVAL, default='0ms'): cv.positive_time_period_milliseconds,
|
||||
}),
|
||||
vol.Optional(CONF_RANDOM): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(RandomLightEffect),
|
||||
vol.Optional(CONF_NAME, default="Random"): cv.string,
|
||||
vol.Optional(CONF_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
}),
|
||||
vol.Optional(CONF_STROBE): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(StrobeLightEffect),
|
||||
vol.Optional(CONF_NAME, default="Strobe"): cv.string,
|
||||
vol.Optional(CONF_COLORS): vol.All(cv.ensure_list, [vol.All(vol.Schema({
|
||||
vol.Optional(CONF_STATE, default=True): cv.boolean,
|
||||
vol.Optional(CONF_BRIGHTNESS, default=1.0): cv.percentage,
|
||||
vol.Optional(CONF_RED, default=1.0): cv.percentage,
|
||||
vol.Optional(CONF_GREEN, default=1.0): cv.percentage,
|
||||
vol.Optional(CONF_BLUE, default=1.0): cv.percentage,
|
||||
vol.Optional(CONF_WHITE, default=1.0): cv.percentage,
|
||||
vol.Required(CONF_DURATION): cv.positive_time_period_milliseconds,
|
||||
}), cv.has_at_least_one_key(CONF_STATE, CONF_BRIGHTNESS, CONF_RED, CONF_GREEN, CONF_BLUE,
|
||||
CONF_WHITE))], vol.Length(min=2)),
|
||||
}),
|
||||
vol.Optional(CONF_FLICKER): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FlickerLightEffect),
|
||||
vol.Optional(CONF_NAME, default="Flicker"): cv.string,
|
||||
vol.Optional(CONF_ALPHA): cv.percentage,
|
||||
vol.Optional(CONF_INTENSITY): cv.percentage,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_LAMBDA): vol.Schema({
|
||||
vol.Required(CONF_NAME): cv.string,
|
||||
vol.Required(CONF_LAMBDA): cv.lambda_,
|
||||
vol.Optional(CONF_UPDATE_INTERVAL, default='0ms'): cv.positive_time_period_milliseconds,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_RAINBOW): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDRainbowLightEffect),
|
||||
vol.Optional(CONF_NAME, default="Rainbow"): cv.string,
|
||||
vol.Optional(CONF_SPEED): cv.uint32_t,
|
||||
vol.Optional(CONF_WIDTH): cv.uint32_t,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_COLOR_WIPE): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDColorWipeEffect),
|
||||
vol.Optional(CONF_NAME, default="Color Wipe"): cv.string,
|
||||
vol.Optional(CONF_COLORS): vol.All(cv.ensure_list, [vol.Schema({
|
||||
vol.Optional(CONF_RED, default=1.0): cv.percentage,
|
||||
vol.Optional(CONF_GREEN, default=1.0): cv.percentage,
|
||||
vol.Optional(CONF_BLUE, default=1.0): cv.percentage,
|
||||
vol.Optional(CONF_RANDOM, default=False): cv.boolean,
|
||||
vol.Required(CONF_NUM_LEDS): vol.All(cv.uint32_t, vol.Range(min=1)),
|
||||
})]),
|
||||
vol.Optional(CONF_ADD_LED_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_REVERSE): cv.boolean,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_SCAN): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDScanEffect),
|
||||
vol.Optional(CONF_NAME, default="Scan"): cv.string,
|
||||
vol.Optional(CONF_MOVE_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_TWINKLE): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDTwinkleEffect),
|
||||
vol.Optional(CONF_NAME, default="Twinkle"): cv.string,
|
||||
vol.Optional(CONF_TWINKLE_PROBABILITY): cv.percentage,
|
||||
vol.Optional(CONF_PROGRESS_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_RANDOM_TWINKLE): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDRandomTwinkleEffect),
|
||||
vol.Optional(CONF_NAME, default="Random Twinkle"): cv.string,
|
||||
vol.Optional(CONF_TWINKLE_PROBABILITY): cv.percentage,
|
||||
vol.Optional(CONF_PROGRESS_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_FIREWORKS): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDFireworksEffect),
|
||||
vol.Optional(CONF_NAME, default="Fireworks"): cv.string,
|
||||
vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_SPARK_PROBABILITY): cv.percentage,
|
||||
vol.Optional(CONF_USE_RANDOM_COLOR): cv.boolean,
|
||||
vol.Optional(CONF_FADE_OUT_RATE): cv.uint8_t,
|
||||
}),
|
||||
vol.Optional(CONF_FASTLED_FLICKER): vol.Schema({
|
||||
cv.GenerateID(CONF_EFFECT_ID): cv.declare_variable_id(FastLEDFlickerEffect),
|
||||
vol.Optional(CONF_NAME, default="FastLED Flicker"): cv.string,
|
||||
vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_INTENSITY): cv.percentage,
|
||||
}),
|
||||
})
|
||||
|
||||
|
||||
def validate_effects(allowed_effects):
|
||||
def validator(value):
|
||||
value = cv.ensure_list(value)
|
||||
names = set()
|
||||
ret = []
|
||||
for i, effect in enumerate(value):
|
||||
if not isinstance(effect, dict):
|
||||
raise vol.Invalid("Each effect must be a dictionary, not {}".format(type(value)))
|
||||
if len(effect) > 1:
|
||||
raise vol.Invalid("Each entry in the 'effects:' option must be a single effect.")
|
||||
if not effect:
|
||||
raise vol.Invalid("Found no effect for the {}th entry in 'effects:'!".format(i))
|
||||
key = next(iter(effect.keys()))
|
||||
if key not in allowed_effects:
|
||||
raise vol.Invalid("The effect '{}' does not exist or is not allowed for this "
|
||||
"light type".format(key))
|
||||
effect[key] = effect[key] or {}
|
||||
conf = EFFECTS_SCHEMA(effect)
|
||||
name = conf[key][CONF_NAME]
|
||||
if name in names:
|
||||
raise vol.Invalid(u"Found the effect name '{}' twice. All effects must have "
|
||||
u"unique names".format(name))
|
||||
names.add(name)
|
||||
ret.append(conf)
|
||||
return ret
|
||||
|
||||
return validator
|
||||
|
||||
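validate_effects above requires every entry under 'effects:' to be a dictionary with exactly one key (the effect type) and pulls that key out with next(iter(...)). A minimal standalone version of that shape check, written for illustration only:

    def single_effect_key(entry):
        if not isinstance(entry, dict):
            raise ValueError("each effect must be a dictionary, not {}".format(type(entry)))
        if len(entry) != 1:
            raise ValueError("each entry must describe exactly one effect")
        return next(iter(entry.keys()))

    print(single_effect_key({"random": {"name": "Random"}}))   # -> random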
|
||||
LIGHT_SCHEMA = cv.MQTT_COMMAND_COMPONENT_SCHEMA.extend({
|
||||
cv.GenerateID(): cv.declare_variable_id(LightState),
|
||||
cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTJSONLightComponent),
|
||||
})
|
||||
|
||||
LIGHT_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(LIGHT_SCHEMA.schema)
|
||||
|
||||
|
||||
def build_effect(full_config):
|
||||
key, config = next(iter(full_config.items()))
|
||||
if key == CONF_LAMBDA:
|
||||
lambda_ = None
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], []):
|
||||
yield None
|
||||
yield LambdaLightEffect.new(config[CONF_NAME], lambda_, config[CONF_UPDATE_INTERVAL])
|
||||
elif key == CONF_RANDOM:
|
||||
rhs = RandomLightEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_TRANSITION_LENGTH in config:
|
||||
add(effect.set_transition_length(config[CONF_TRANSITION_LENGTH]))
|
||||
if CONF_UPDATE_INTERVAL in config:
|
||||
add(effect.set_update_interval(config[CONF_UPDATE_INTERVAL]))
|
||||
yield effect
|
||||
elif key == CONF_STROBE:
|
||||
rhs = StrobeLightEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
colors = []
|
||||
for color in config.get(CONF_COLORS, []):
|
||||
colors.append(StructInitializer(
|
||||
StrobeLightEffectColor,
|
||||
('color', LightColorValues(color[CONF_STATE], color[CONF_BRIGHTNESS],
|
||||
color[CONF_RED], color[CONF_GREEN], color[CONF_BLUE],
|
||||
color[CONF_WHITE])),
|
||||
('duration', color[CONF_DURATION]),
|
||||
))
|
||||
if colors:
|
||||
add(effect.set_colors(ArrayInitializer(*colors)))
|
||||
yield effect
|
||||
elif key == CONF_FLICKER:
|
||||
rhs = FlickerLightEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_ALPHA in config:
|
||||
add(effect.set_alpha(config[CONF_ALPHA]))
|
||||
if CONF_INTENSITY in config:
|
||||
add(effect.set_intensity(config[CONF_INTENSITY]))
|
||||
yield effect
|
||||
elif key == CONF_FASTLED_LAMBDA:
|
||||
lambda_ = None
|
||||
args = [(RawExpression('FastLEDLightOutputComponent &'), 'it')]
|
||||
for lambda_ in process_lambda(config[CONF_LAMBDA], args):
|
||||
yield None
|
||||
yield FastLEDLambdaLightEffect.new(config[CONF_NAME], lambda_, config[CONF_UPDATE_INTERVAL])
|
||||
elif key == CONF_FASTLED_RAINBOW:
|
||||
rhs = FastLEDRainbowLightEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_SPEED in config:
|
||||
add(effect.set_speed(config[CONF_SPEED]))
|
||||
if CONF_WIDTH in config:
|
||||
add(effect.set_width(config[CONF_WIDTH]))
|
||||
yield effect
|
||||
elif key == CONF_FASTLED_COLOR_WIPE:
|
||||
rhs = FastLEDColorWipeEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_ADD_LED_INTERVAL in config:
|
||||
add(effect.set_add_led_interval(config[CONF_ADD_LED_INTERVAL]))
|
||||
if CONF_REVERSE in config:
|
||||
add(effect.set_reverse(config[CONF_REVERSE]))
|
||||
colors = []
|
||||
for color in config.get(CONF_COLORS, []):
|
||||
colors.append(StructInitializer(
|
||||
FastLEDColorWipeEffectColor,
|
||||
('r', color[CONF_RED]),
|
||||
('g', color[CONF_GREEN]),
|
||||
('b', color[CONF_BLUE]),
|
||||
('random', color[CONF_RANDOM]),
|
||||
('num_leds', color[CONF_NUM_LEDS]),
|
||||
))
|
||||
if colors:
|
||||
add(effect.set_colors(ArrayInitializer(*colors)))
|
||||
yield effect
|
||||
elif key == CONF_FASTLED_SCAN:
|
||||
rhs = FastLEDScanEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_MOVE_INTERVAL in config:
|
||||
add(effect.set_move_interval(config[CONF_MOVE_INTERVAL]))
|
||||
yield effect
|
||||
elif key == CONF_FASTLED_TWINKLE:
|
||||
rhs = FastLEDTwinkleEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_TWINKLE_PROBABILITY in config:
|
||||
add(effect.set_twinkle_probability(config[CONF_TWINKLE_PROBABILITY]))
|
||||
if CONF_PROGRESS_INTERVAL in config:
|
||||
add(effect.set_progress_interval(config[CONF_PROGRESS_INTERVAL]))
|
||||
yield effect
|
||||
elif key == CONF_FASTLED_RANDOM_TWINKLE:
|
||||
rhs = FastLEDRandomTwinkleEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_TWINKLE_PROBABILITY in config:
|
||||
add(effect.set_twinkle_probability(config[CONF_TWINKLE_PROBABILITY]))
|
||||
if CONF_PROGRESS_INTERVAL in config:
|
||||
add(effect.set_progress_interval(config[CONF_PROGRESS_INTERVAL]))
|
||||
yield effect
|
||||
elif key == CONF_FASTLED_FIREWORKS:
|
||||
rhs = FastLEDFireworksEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_UPDATE_INTERVAL in config:
|
||||
add(effect.set_update_interval(config[CONF_UPDATE_INTERVAL]))
|
||||
if CONF_SPARK_PROBABILITY in config:
|
||||
add(effect.set_spark_probability(config[CONF_SPARK_PROBABILITY]))
|
||||
if CONF_USE_RANDOM_COLOR in config:
|
||||
add(effect.set_spark_probability(config[CONF_USE_RANDOM_COLOR]))
|
||||
if CONF_FADE_OUT_RATE in config:
|
||||
add(effect.set_spark_probability(config[CONF_FADE_OUT_RATE]))
|
||||
yield effect
|
||||
elif key == CONF_FASTLED_FLICKER:
|
||||
rhs = FastLEDFlickerEffect.new(config[CONF_NAME])
|
||||
effect = Pvariable(config[CONF_EFFECT_ID], rhs)
|
||||
if CONF_UPDATE_INTERVAL in config:
|
||||
add(effect.set_update_interval(config[CONF_UPDATE_INTERVAL]))
|
||||
if CONF_INTENSITY in config:
|
||||
add(effect.set_intensity(config[CONF_INTENSITY]))
|
||||
yield effect
|
||||
else:
|
||||
raise NotImplementedError("Effect {} not implemented".format(key))
|
||||
|
||||
|
||||
def setup_light_core_(light_var, mqtt_var, config):
|
||||
if CONF_INTERNAL in config:
|
||||
add(light_var.set_internal(config[CONF_INTERNAL]))
|
||||
if CONF_DEFAULT_TRANSITION_LENGTH in config:
|
||||
add(light_var.set_default_transition_length(config[CONF_DEFAULT_TRANSITION_LENGTH]))
|
||||
if CONF_GAMMA_CORRECT in config:
|
||||
add(light_var.set_gamma_correct(config[CONF_GAMMA_CORRECT]))
|
||||
effects = []
|
||||
for conf in config.get(CONF_EFFECTS, []):
|
||||
for effect in build_effect(conf):
|
||||
yield
|
||||
effects.append(effect)
|
||||
if effects:
|
||||
add(light_var.add_effects(ArrayInitializer(*effects)))
|
||||
|
||||
setup_mqtt_component(mqtt_var, config)
|
||||
|
||||
@@ -33,7 +336,115 @@ def setup_light_core_(light_var, mqtt_var, config):
|
||||
def setup_light(light_obj, mqtt_obj, config):
|
||||
light_var = Pvariable(config[CONF_ID], light_obj, has_side_effects=False)
|
||||
mqtt_var = Pvariable(config[CONF_MQTT_ID], mqtt_obj, has_side_effects=False)
|
||||
setup_light_core_(light_var, mqtt_var, config)
|
||||
add_job(setup_light_core_, light_var, mqtt_var, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_LIGHT'
|
||||
|
||||
|
||||
CONF_LIGHT_TOGGLE = 'light.toggle'
LIGHT_TOGGLE_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(None),
    vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
})


@ACTION_REGISTRY.register(CONF_LIGHT_TOGGLE, LIGHT_TOGGLE_ACTION_SCHEMA)
def light_toggle_to_code(config, action_id, arg_type):
    template_arg = TemplateArguments(arg_type)
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_toggle_action(template_arg)
    type = ToggleAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    if CONF_TRANSITION_LENGTH in config:
        for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
            yield None
        add(action.set_transition_length(template_))
    yield action

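The toggle schema above (and the turn_off/turn_on schemas below) is wrapped in maybe_simple_id, so an action can be given either as a full mapping or as a bare ID. A hypothetical standalone re-implementation showing that normalization; the real helper lives in esphomeyaml.automation:

import voluptuous as vol

def maybe_simple_id(schema):
    # Sketch only: wrap a bare value into {'id': value} before validating.
    full_schema = vol.Schema(schema)

    def validator(value):
        if isinstance(value, dict):
            return full_schema(value)
        return full_schema({'id': value})

    return validator

validate = maybe_simple_id({vol.Required('id'): str})
print(validate('living_room_light'))            # {'id': 'living_room_light'}
print(validate({'id': 'living_room_light'}))    # unchanged
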
CONF_LIGHT_TURN_OFF = 'light.turn_off'
LIGHT_TURN_OFF_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(None),
    vol.Optional(CONF_TRANSITION_LENGTH): cv.templatable(cv.positive_time_period_milliseconds),
})


@ACTION_REGISTRY.register(CONF_LIGHT_TURN_OFF, LIGHT_TURN_OFF_ACTION_SCHEMA)
def light_turn_off_to_code(config, action_id, arg_type):
    template_arg = TemplateArguments(arg_type)
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_turn_off_action(template_arg)
    type = TurnOffAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    if CONF_TRANSITION_LENGTH in config:
        for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
            yield None
        add(action.set_transition_length(template_))
    yield action


CONF_LIGHT_TURN_ON = 'light.turn_on'
LIGHT_TURN_ON_ACTION_SCHEMA = maybe_simple_id({
    vol.Required(CONF_ID): cv.use_variable_id(None),
    vol.Exclusive(CONF_TRANSITION_LENGTH, 'transformer'):
        cv.templatable(cv.positive_time_period_milliseconds),
    vol.Exclusive(CONF_FLASH_LENGTH, 'transformer'):
        cv.templatable(cv.positive_time_period_milliseconds),
    vol.Optional(CONF_BRIGHTNESS): cv.templatable(cv.percentage),
    vol.Optional(CONF_RED): cv.templatable(cv.percentage),
    vol.Optional(CONF_GREEN): cv.templatable(cv.percentage),
    vol.Optional(CONF_BLUE): cv.templatable(cv.percentage),
    vol.Optional(CONF_WHITE): cv.templatable(cv.percentage),
    vol.Optional(CONF_COLOR_TEMPERATURE): cv.templatable(cv.positive_float),
    vol.Optional(CONF_EFFECT): cv.templatable(cv.string),
})

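transition_length and flash_length share the 'transformer' exclusion group, so a single turn_on action may set at most one of them. A quick standalone check of the stock voluptuous behaviour behind that:

import voluptuous as vol

schema = vol.Schema({
    vol.Exclusive('transition_length', 'transformer'): int,
    vol.Exclusive('flash_length', 'transformer'): int,
})

schema({'transition_length': 500})   # accepted
schema({'flash_length': 250})        # accepted
try:
    schema({'transition_length': 500, 'flash_length': 250})
except vol.MultipleInvalid as err:
    print(err)   # rejected: both keys sit in the exclusion group 'transformer'
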
@ACTION_REGISTRY.register(CONF_LIGHT_TURN_ON, LIGHT_TURN_ON_ACTION_SCHEMA)
def light_turn_on_to_code(config, action_id, arg_type):
    template_arg = TemplateArguments(arg_type)
    for var in get_variable(config[CONF_ID]):
        yield None
    rhs = var.make_turn_on_action(template_arg)
    type = TurnOnAction.template(template_arg)
    action = Pvariable(action_id, rhs, type=type)
    if CONF_TRANSITION_LENGTH in config:
        for template_ in templatable(config[CONF_TRANSITION_LENGTH], arg_type, uint32):
            yield None
        add(action.set_transition_length(template_))
    if CONF_FLASH_LENGTH in config:
        for template_ in templatable(config[CONF_FLASH_LENGTH], arg_type, uint32):
            yield None
        add(action.set_flash_length(template_))
    if CONF_BRIGHTNESS in config:
        for template_ in templatable(config[CONF_BRIGHTNESS], arg_type, float_):
            yield None
        add(action.set_brightness(template_))
    if CONF_RED in config:
        for template_ in templatable(config[CONF_RED], arg_type, float_):
            yield None
        add(action.set_red(template_))
    if CONF_GREEN in config:
        for template_ in templatable(config[CONF_GREEN], arg_type, float_):
            yield None
        add(action.set_green(template_))
    if CONF_BLUE in config:
        for template_ in templatable(config[CONF_BLUE], arg_type, float_):
            yield None
        add(action.set_blue(template_))
    if CONF_WHITE in config:
        for template_ in templatable(config[CONF_WHITE], arg_type, float_):
            yield None
        add(action.set_white(template_))
    if CONF_COLOR_TEMPERATURE in config:
        for template_ in templatable(config[CONF_COLOR_TEMPERATURE], arg_type, float_):
            yield None
        add(action.set_color_temperature(template_))
    if CONF_EFFECT in config:
        for template_ in templatable(config[CONF_EFFECT], arg_type, std_string):
            yield None
        add(action.set_effect(template_))
    yield action

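Every optional key above goes through the same templatable loop: the configured value may be a literal or a lambda, and either way the generator ends up with one expression it can hand to a setter. A rough standalone sketch of that idea with made-up stand-ins; the real templatable helper lives in esphomeyaml.helpers and produces C++ expression objects, not strings:

class Lambda(object):
    # Stand-in for esphomeyaml.core.Lambda, illustration only.
    def __init__(self, body):
        self.body = body

def templatable(value, arg_type, target_type):
    if isinstance(value, Lambda):
        # a lambda becomes a callable taking the trigger argument
        yield "[=](%s x) -> %s { return %s; }" % (arg_type, target_type, value.body)
    else:
        # a literal is emitted as-is
        yield repr(value)

for template_ in templatable(Lambda("id(sensor1).state / 100.0"), "float", "float"):
    pass
print(template_)   # [=](float x) -> float { return id(sensor1).state / 100.0; }
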
@@ -2,13 +2,14 @@ import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import light
|
||||
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_OUTPUT
|
||||
from esphomeyaml.const import CONF_MAKE_ID, CONF_NAME, CONF_OUTPUT, CONF_EFFECTS
|
||||
from esphomeyaml.helpers import App, get_variable, variable
|
||||
|
||||
PLATFORM_SCHEMA = light.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
|
||||
vol.Required(CONF_OUTPUT): cv.use_variable_id(None),
|
||||
}).extend(light.LIGHT_SCHEMA.schema)
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.BINARY_EFFECTS),
|
||||
}))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
esphomeyaml/components/light/cwww.py (new file)
@@ -0,0 +1,34 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import light
|
||||
from esphomeyaml.components.light.rgbww import validate_cold_white_colder, \
|
||||
validate_color_temperature
|
||||
from esphomeyaml.const import CONF_COLD_WHITE, CONF_COLD_WHITE_COLOR_TEMPERATURE, \
|
||||
CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_MAKE_ID, \
|
||||
CONF_NAME, CONF_WARM_WHITE, CONF_WARM_WHITE_COLOR_TEMPERATURE
|
||||
from esphomeyaml.helpers import App, get_variable, variable
|
||||
|
||||
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
|
||||
vol.Required(CONF_COLD_WHITE): cv.use_variable_id(None),
|
||||
vol.Required(CONF_WARM_WHITE): cv.use_variable_id(None),
|
||||
vol.Required(CONF_COLD_WHITE_COLOR_TEMPERATURE): validate_color_temperature,
|
||||
vol.Required(CONF_WARM_WHITE_COLOR_TEMPERATURE): validate_color_temperature,
|
||||
|
||||
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
|
||||
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.MONOCHROMATIC_EFFECTS),
|
||||
}), validate_cold_white_colder)
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for cold_white in get_variable(config[CONF_COLD_WHITE]):
|
||||
yield
|
||||
for warm_white in get_variable(config[CONF_WARM_WHITE]):
|
||||
yield
|
||||
rhs = App.make_cwww_light(config[CONF_NAME], config[CONF_COLD_WHITE_COLOR_TEMPERATURE],
|
||||
config[CONF_WARM_WHITE_COLOR_TEMPERATURE],
|
||||
cold_white, warm_white)
|
||||
light_struct = variable(config[CONF_MAKE_ID], rhs)
|
||||
light.setup_light(light_struct.Pstate, light_struct.Pmqtt, config)
|
||||
@@ -6,7 +6,7 @@ from esphomeyaml.components import light
|
||||
from esphomeyaml.components.power_supply import PowerSupplyComponent
|
||||
from esphomeyaml.const import CONF_CHIPSET, CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, \
|
||||
CONF_MAKE_ID, CONF_MAX_REFRESH_RATE, CONF_NAME, CONF_NUM_LEDS, CONF_PIN, CONF_POWER_SUPPLY, \
|
||||
CONF_RGB_ORDER
|
||||
CONF_RGB_ORDER, CONF_EFFECTS, CONF_COLOR_CORRECT
|
||||
from esphomeyaml.helpers import App, Application, RawExpression, TemplateArguments, add, \
|
||||
get_variable, variable
|
||||
|
||||
@@ -55,7 +55,7 @@ def validate(value):
|
||||
|
||||
MakeFastLEDLight = Application.MakeFastLEDLight
|
||||
|
||||
PLATFORM_SCHEMA = vol.All(light.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeFastLEDLight),
|
||||
|
||||
vol.Required(CONF_CHIPSET): vol.All(vol.Upper, cv.one_of(*TYPES)),
|
||||
@@ -66,9 +66,11 @@ PLATFORM_SCHEMA = vol.All(light.PLATFORM_SCHEMA.extend({
|
||||
vol.Optional(CONF_RGB_ORDER): vol.All(vol.Upper, cv.one_of(*RGB_ORDERS)),
|
||||
|
||||
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
|
||||
vol.Optional(CONF_COLOR_CORRECT): vol.All([cv.percentage], vol.Length(min=3, max=3)),
|
||||
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_POWER_SUPPLY): cv.use_variable_id(PowerSupplyComponent),
|
||||
}).extend(light.LIGHT_SCHEMA.schema), validate)
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.FASTLED_EFFECTS),
|
||||
}), validate)
|
||||
|
||||
|
||||
def to_code(config):
|
||||
@@ -92,6 +94,10 @@ def to_code(config):
|
||||
yield
|
||||
add(fast_led.set_power_supply(power_supply))
|
||||
|
||||
if CONF_COLOR_CORRECT in config:
|
||||
r, g, b = config[CONF_COLOR_CORRECT]
|
||||
add(fast_led.set_correction(r, g, b))
|
||||
|
||||
light.setup_light(make.Pstate, make.Pmqtt, config)
|
||||
|
||||
|
||||
|
||||
@@ -6,7 +6,7 @@ from esphomeyaml.components import light
|
||||
from esphomeyaml.components.power_supply import PowerSupplyComponent
|
||||
from esphomeyaml.const import CONF_CHIPSET, CONF_CLOCK_PIN, CONF_DATA_PIN, \
|
||||
CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, CONF_MAKE_ID, CONF_MAX_REFRESH_RATE, \
|
||||
CONF_NAME, CONF_NUM_LEDS, CONF_POWER_SUPPLY, CONF_RGB_ORDER
|
||||
CONF_NAME, CONF_NUM_LEDS, CONF_POWER_SUPPLY, CONF_RGB_ORDER, CONF_EFFECTS, CONF_COLOR_CORRECT
|
||||
from esphomeyaml.helpers import App, Application, RawExpression, TemplateArguments, add, \
|
||||
get_variable, variable
|
||||
|
||||
@@ -32,7 +32,7 @@ RGB_ORDERS = [
|
||||
|
||||
MakeFastLEDLight = Application.MakeFastLEDLight
|
||||
|
||||
PLATFORM_SCHEMA = light.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeFastLEDLight),
|
||||
|
||||
vol.Required(CONF_CHIPSET): vol.All(vol.Upper, cv.one_of(*CHIPSETS)),
|
||||
@@ -44,9 +44,11 @@ PLATFORM_SCHEMA = light.PLATFORM_SCHEMA.extend({
|
||||
vol.Optional(CONF_MAX_REFRESH_RATE): cv.positive_time_period_microseconds,
|
||||
|
||||
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
|
||||
vol.Optional(CONF_COLOR_CORRECT): vol.All([cv.percentage], vol.Length(min=3, max=3)),
|
||||
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_POWER_SUPPLY): cv.use_variable_id(PowerSupplyComponent),
|
||||
}).extend(light.LIGHT_SCHEMA.schema)
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.FASTLED_EFFECTS),
|
||||
}))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
@@ -72,7 +74,11 @@ def to_code(config):
|
||||
yield
|
||||
add(fast_led.set_power_supply(power_supply))
|
||||
|
||||
light.setup_light(make.Pstate, make.Pmqtt, config)
|
||||
if CONF_COLOR_CORRECT in config:
|
||||
r, g, b = config[CONF_COLOR_CORRECT]
|
||||
add(fast_led.set_correction(r, g, b))
|
||||
|
||||
light.setup_light(make.Pstate, make.Pmqtt, config)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_FAST_LED_LIGHT'
|
||||
|
||||
@@ -3,15 +3,16 @@ import voluptuous as vol
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import light
|
||||
from esphomeyaml.const import CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, CONF_MAKE_ID, \
|
||||
CONF_NAME, CONF_OUTPUT
|
||||
CONF_NAME, CONF_OUTPUT, CONF_EFFECTS
|
||||
from esphomeyaml.helpers import App, get_variable, variable
|
||||
|
||||
PLATFORM_SCHEMA = light.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
|
||||
vol.Required(CONF_OUTPUT): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
|
||||
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
}).extend(light.LIGHT_SCHEMA.schema)
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.MONOCHROMATIC_EFFECTS),
|
||||
}))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
@@ -3,17 +3,18 @@ import voluptuous as vol
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import light
|
||||
from esphomeyaml.const import CONF_BLUE, CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, \
|
||||
CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED
|
||||
CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED, CONF_EFFECTS
|
||||
from esphomeyaml.helpers import App, get_variable, variable
|
||||
|
||||
PLATFORM_SCHEMA = light.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
|
||||
vol.Required(CONF_RED): cv.use_variable_id(None),
|
||||
vol.Required(CONF_GREEN): cv.use_variable_id(None),
|
||||
vol.Required(CONF_BLUE): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
|
||||
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
}).extend(light.LIGHT_SCHEMA.schema)
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.RGB_EFFECTS),
|
||||
}))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
@@ -3,10 +3,10 @@ import voluptuous as vol
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import light
|
||||
from esphomeyaml.const import CONF_BLUE, CONF_DEFAULT_TRANSITION_LENGTH, CONF_GAMMA_CORRECT, \
|
||||
CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED, CONF_WHITE
|
||||
CONF_GREEN, CONF_MAKE_ID, CONF_NAME, CONF_RED, CONF_WHITE, CONF_EFFECTS
|
||||
from esphomeyaml.helpers import App, get_variable, variable
|
||||
|
||||
PLATFORM_SCHEMA = light.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
|
||||
vol.Required(CONF_RED): cv.use_variable_id(None),
|
||||
vol.Required(CONF_GREEN): cv.use_variable_id(None),
|
||||
@@ -14,7 +14,8 @@ PLATFORM_SCHEMA = light.PLATFORM_SCHEMA.extend({
|
||||
vol.Required(CONF_WHITE): cv.use_variable_id(None),
|
||||
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
|
||||
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
}).extend(light.LIGHT_SCHEMA.schema)
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.RGB_EFFECTS),
|
||||
}))
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
esphomeyaml/components/light/rgbww.py (new file)
@@ -0,0 +1,62 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import light
|
||||
from esphomeyaml.const import CONF_BLUE, CONF_COLD_WHITE, CONF_COLD_WHITE_COLOR_TEMPERATURE, \
|
||||
CONF_DEFAULT_TRANSITION_LENGTH, CONF_EFFECTS, CONF_GAMMA_CORRECT, CONF_GREEN, CONF_MAKE_ID, \
|
||||
CONF_NAME, CONF_RED, CONF_WARM_WHITE, CONF_WARM_WHITE_COLOR_TEMPERATURE
|
||||
from esphomeyaml.helpers import App, get_variable, variable
|
||||
|
||||
|
||||
def validate_color_temperature(value):
    try:
        val = cv.float_with_unit('Color Temperature', 'mireds')(value)
    except vol.Invalid:
        val = 1000000.0 / cv.float_with_unit('Color Temperature', 'K')(value)
    if val < 0:
        raise vol.Invalid("Color temperature cannot be negative")
    return val

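The validator accepts mireds directly or a Kelvin value with a unit, converting Kelvin via 1,000,000 / K; lower mired values mean colder light. A quick worked example of the conversion:

def kelvin_to_mireds(kelvin):
    # Same conversion the validator applies to values given in Kelvin
    return 1000000.0 / kelvin

print(kelvin_to_mireds(6536))   # ~153 mireds, a typical cold white
print(kelvin_to_mireds(2700))   # ~370 mireds, a typical warm white
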
def validate_cold_white_colder(value):
    cw = value[CONF_COLD_WHITE_COLOR_TEMPERATURE]
    ww = value[CONF_WARM_WHITE_COLOR_TEMPERATURE]
    if cw > ww:
        raise vol.Invalid("Cold white color temperature cannot be higher than warm white")
    if cw == ww:
        raise vol.Invalid("Cold white color temperature cannot be the same as warm white")
    return value


PLATFORM_SCHEMA = cv.nameable(light.LIGHT_PLATFORM_SCHEMA.extend({
|
||||
cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(light.MakeLight),
|
||||
vol.Required(CONF_RED): cv.use_variable_id(None),
|
||||
vol.Required(CONF_GREEN): cv.use_variable_id(None),
|
||||
vol.Required(CONF_BLUE): cv.use_variable_id(None),
|
||||
vol.Required(CONF_COLD_WHITE): cv.use_variable_id(None),
|
||||
vol.Required(CONF_WARM_WHITE): cv.use_variable_id(None),
|
||||
vol.Required(CONF_COLD_WHITE_COLOR_TEMPERATURE): validate_color_temperature,
|
||||
vol.Required(CONF_WARM_WHITE_COLOR_TEMPERATURE): validate_color_temperature,
|
||||
|
||||
vol.Optional(CONF_GAMMA_CORRECT): cv.positive_float,
|
||||
vol.Optional(CONF_DEFAULT_TRANSITION_LENGTH): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_EFFECTS): light.validate_effects(light.RGB_EFFECTS),
|
||||
}), validate_cold_white_colder)
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for red in get_variable(config[CONF_RED]):
|
||||
yield
|
||||
for green in get_variable(config[CONF_GREEN]):
|
||||
yield
|
||||
for blue in get_variable(config[CONF_BLUE]):
|
||||
yield
|
||||
for cold_white in get_variable(config[CONF_COLD_WHITE]):
|
||||
yield
|
||||
for warm_white in get_variable(config[CONF_WARM_WHITE]):
|
||||
yield
|
||||
rhs = App.make_rgbww_light(config[CONF_NAME], config[CONF_COLD_WHITE_COLOR_TEMPERATURE],
|
||||
config[CONF_WARM_WHITE_COLOR_TEMPERATURE],
|
||||
red, green, blue, cold_white, warm_white)
|
||||
light_struct = variable(config[CONF_MAKE_ID], rhs)
|
||||
light.setup_light(light_struct.Pstate, light_struct.Pmqtt, config)
|
||||
@@ -1,10 +1,14 @@
|
||||
import re
|
||||
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml.automation import ACTION_REGISTRY, LambdaAction
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_BAUD_RATE, CONF_ID, CONF_LEVEL, CONF_LOGS, \
|
||||
CONF_TX_BUFFER_SIZE
|
||||
from esphomeyaml.core import ESPHomeYAMLError
|
||||
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, global_ns
|
||||
from esphomeyaml.const import CONF_ARGS, CONF_BAUD_RATE, CONF_FORMAT, CONF_ID, CONF_LEVEL, \
|
||||
CONF_LOGS, CONF_TAG, CONF_TX_BUFFER_SIZE
|
||||
from esphomeyaml.core import ESPHomeYAMLError, Lambda
|
||||
from esphomeyaml.helpers import App, Pvariable, TemplateArguments, add, esphomelib_ns, global_ns, \
|
||||
process_lambda, RawExpression, statement
|
||||
|
||||
LOG_LEVELS = {
|
||||
'NONE': global_ns.ESPHOMELIB_LOG_LEVEL_NONE,
|
||||
@@ -16,6 +20,15 @@ LOG_LEVELS = {
|
||||
'VERY_VERBOSE': global_ns.ESPHOMELIB_LOG_LEVEL_VERY_VERBOSE,
|
||||
}
|
||||
|
||||
LOG_LEVEL_TO_ESP_LOG = {
|
||||
'ERROR': global_ns.ESP_LOGE,
|
||||
'WARN': global_ns.ESP_LOGW,
|
||||
'INFO': global_ns.ESP_LOGI,
|
||||
'DEBUG': global_ns.ESP_LOGD,
|
||||
'VERBOSE': global_ns.ESP_LOGV,
|
||||
'VERY_VERBOSE': global_ns.ESP_LOGVV,
|
||||
}
|
||||
|
||||
LOG_LEVEL_SEVERITY = ['NONE', 'ERROR', 'WARN', 'INFO', 'DEBUG', 'VERBOSE', 'VERY_VERBOSE']
|
||||
|
||||
# pylint: disable=invalid-name
|
||||
@@ -35,8 +48,8 @@ LogComponent = esphomelib_ns.LogComponent
|
||||
|
||||
CONFIG_SCHEMA = vol.All(vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(LogComponent),
|
||||
vol.Optional(CONF_BAUD_RATE): cv.positive_int,
|
||||
vol.Optional(CONF_TX_BUFFER_SIZE): cv.positive_int,
|
||||
vol.Optional(CONF_BAUD_RATE, default=115200): cv.positive_int,
|
||||
vol.Optional(CONF_TX_BUFFER_SIZE): cv.validate_bytes,
|
||||
vol.Optional(CONF_LEVEL): is_log_level,
|
||||
vol.Optional(CONF_LOGS): vol.Schema({
|
||||
cv.string: is_log_level,
|
||||
@@ -59,3 +72,57 @@ def required_build_flags(config):
|
||||
if CONF_LEVEL in config:
|
||||
return u'-DESPHOMELIB_LOG_LEVEL={}'.format(str(LOG_LEVELS[config[CONF_LEVEL]]))
|
||||
return None
|
||||
|
||||
|
||||
def maybe_simple_message(schema):
    def validator(value):
        if isinstance(value, dict):
            return vol.Schema(schema)(value)
        return vol.Schema(schema)({CONF_FORMAT: value})

    return validator


def validate_printf(value):
    # https://stackoverflow.com/questions/30011379/how-can-i-parse-a-c-format-string-in-python
    # pylint: disable=anomalous-backslash-in-string
    cfmt = u"""\
    (                                  # start of capture group 1
    %                                  # literal "%"
    (?:                                # first option
    (?:[-+0 #]{0,5})                   # optional flags
    (?:\d+|\*)?                        # width
    (?:\.(?:\d+|\*))?                  # precision
    (?:h|l|ll|w|I|I32|I64)?            # size
    [cCdiouxXeEfgGaAnpsSZ]             # type
    ) |                                # OR
    %%)                                # literal "%%"
    """
    matches = re.findall(cfmt, value[CONF_FORMAT], flags=re.X)
    if len(matches) != len(value[CONF_ARGS]):
        raise vol.Invalid(u"Found {} printf-patterns ({}), but {} args were given!"
                          u"".format(len(matches), u', '.join(matches), len(value[CONF_ARGS])))
    return value

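For reference, a condensed standalone version of the placeholder-counting idea (not the exact verbose pattern above) shows how the required argument count is derived:

import re

# Condensed illustration only; the validator above uses the verbose pattern.
PRINTF_RE = re.compile(r'%(?:[-+0 #]{0,5}(?:\d+|\*)?(?:\.(?:\d+|\*))?'
                       r'(?:h|l|ll|w|I|I32|I64)?[cCdiouxXeEfgGaAnpsSZ])')

fmt = "Temperature is %.1f after %d samples"
print(PRINTF_RE.findall(fmt))   # ['%.1f', '%d'] -> exactly two args are required
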
CONF_LOGGER_LOG = 'logger.log'
|
||||
LOGGER_LOG_ACTION_SCHEMA = vol.All(maybe_simple_message({
|
||||
vol.Required(CONF_FORMAT): cv.string,
|
||||
vol.Optional(CONF_ARGS, default=list): vol.All(cv.ensure_list, [cv.lambda_]),
|
||||
vol.Optional(CONF_LEVEL, default="DEBUG"): vol.All(vol.Upper, cv.one_of(*LOG_LEVEL_TO_ESP_LOG)),
|
||||
vol.Optional(CONF_TAG, default="main"): cv.string,
|
||||
}), validate_printf)
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_LOGGER_LOG, LOGGER_LOG_ACTION_SCHEMA)
|
||||
def logger_log_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
esp_log = LOG_LEVEL_TO_ESP_LOG[config[CONF_LEVEL]]
|
||||
args = [RawExpression(unicode(x)) for x in config[CONF_ARGS]]
|
||||
|
||||
text = unicode(statement(esp_log(config[CONF_TAG], config[CONF_FORMAT], *args)))
|
||||
|
||||
for lambda_ in process_lambda(Lambda(text), [(arg_type, 'x')]):
|
||||
yield None
|
||||
rhs = LambdaAction.new(template_arg, lambda_)
|
||||
type = LambdaAction.template(template_arg)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
@@ -2,21 +2,23 @@ import re
|
||||
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import automation
|
||||
from esphomeyaml.automation import ACTION_REGISTRY
|
||||
from esphomeyaml.components import logger
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_BIRTH_MESSAGE, CONF_BROKER, CONF_CLIENT_ID, CONF_DISCOVERY, \
|
||||
CONF_DISCOVERY_PREFIX, CONF_DISCOVERY_RETAIN, CONF_ID, CONF_KEEPALIVE, CONF_LOG_TOPIC, \
|
||||
CONF_ON_MESSAGE, CONF_PASSWORD, CONF_PAYLOAD, CONF_PORT, CONF_QOS, CONF_RETAIN, \
|
||||
CONF_SSL_FINGERPRINTS, CONF_TOPIC, CONF_TOPIC_PREFIX, CONF_TRIGGER_ID, CONF_USERNAME, \
|
||||
CONF_WILL_MESSAGE
|
||||
CONF_DISCOVERY_PREFIX, CONF_DISCOVERY_RETAIN, CONF_ID, CONF_KEEPALIVE, CONF_LEVEL, \
|
||||
CONF_LOG_TOPIC, CONF_ON_MESSAGE, CONF_PASSWORD, CONF_PAYLOAD, CONF_PORT, CONF_QOS, \
|
||||
CONF_REBOOT_TIMEOUT, CONF_RETAIN, CONF_SHUTDOWN_MESSAGE, CONF_SSL_FINGERPRINTS, CONF_TOPIC, \
|
||||
CONF_TOPIC_PREFIX, CONF_TRIGGER_ID, CONF_USERNAME, CONF_WILL_MESSAGE, CONF_ON_JSON_MESSAGE
|
||||
from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, RawExpression, \
|
||||
StructInitializer, \
|
||||
TemplateArguments, add, esphomelib_ns, optional, std_string
|
||||
StructInitializer, TemplateArguments, add, esphomelib_ns, optional, std_string, templatable, \
|
||||
uint8, bool_, JsonObjectRef, process_lambda, JsonObjectConstRef
|
||||
|
||||
|
||||
def validate_message_just_topic(value):
|
||||
value = cv.publish_topic(value)
|
||||
return {CONF_TOPIC: value}
|
||||
return MQTT_MESSAGE_BASE({CONF_TOPIC: value})
|
||||
|
||||
|
||||
MQTT_MESSAGE_BASE = vol.Schema({
|
||||
@@ -35,7 +37,9 @@ mqtt_ns = esphomelib_ns.namespace('mqtt')
|
||||
MQTTMessage = mqtt_ns.MQTTMessage
|
||||
MQTTClientComponent = mqtt_ns.MQTTClientComponent
|
||||
MQTTPublishAction = mqtt_ns.MQTTPublishAction
|
||||
MQTTPublishJsonAction = mqtt_ns.MQTTPublishJsonAction
|
||||
MQTTMessageTrigger = mqtt_ns.MQTTMessageTrigger
|
||||
MQTTJsonMessageTrigger = mqtt_ns.MQTTJsonMessageTrigger
|
||||
|
||||
|
||||
def validate_broker(value):
|
||||
@@ -66,16 +70,25 @@ CONFIG_SCHEMA = vol.Schema({
|
||||
vol.Optional(CONF_DISCOVERY_PREFIX): cv.publish_topic,
|
||||
vol.Optional(CONF_BIRTH_MESSAGE): MQTT_MESSAGE_SCHEMA,
|
||||
vol.Optional(CONF_WILL_MESSAGE): MQTT_MESSAGE_SCHEMA,
|
||||
vol.Optional(CONF_SHUTDOWN_MESSAGE): MQTT_MESSAGE_SCHEMA,
|
||||
vol.Optional(CONF_TOPIC_PREFIX): cv.publish_topic,
|
||||
vol.Optional(CONF_LOG_TOPIC): MQTT_MESSAGE_TEMPLATE_SCHEMA,
|
||||
vol.Optional(CONF_LOG_TOPIC): vol.Any(None, MQTT_MESSAGE_BASE.extend({
|
||||
vol.Optional(CONF_LEVEL): logger.is_log_level,
|
||||
}), validate_message_just_topic),
|
||||
vol.Optional(CONF_SSL_FINGERPRINTS): vol.All(cv.only_on_esp8266,
|
||||
cv.ensure_list, [validate_fingerprint]),
|
||||
vol.Optional(CONF_KEEPALIVE): cv.positive_time_period_seconds,
|
||||
vol.Optional(CONF_ON_MESSAGE): vol.All(cv.ensure_list, [automation.AUTOMATION_SCHEMA.extend({
|
||||
vol.Optional(CONF_REBOOT_TIMEOUT): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_ON_MESSAGE): automation.validate_automation({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(MQTTMessageTrigger),
|
||||
vol.Required(CONF_TOPIC): cv.publish_topic,
|
||||
vol.Optional(CONF_QOS, 0): cv.mqtt_qos,
|
||||
})])
|
||||
vol.Required(CONF_TOPIC): cv.subscribe_topic,
|
||||
vol.Optional(CONF_QOS, default=0): cv.mqtt_qos,
|
||||
}),
|
||||
vol.Optional(CONF_ON_JSON_MESSAGE): automation.validate_automation({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(MQTTJsonMessageTrigger),
|
||||
vol.Required(CONF_TOPIC): cv.subscribe_topic,
|
||||
vol.Optional(CONF_QOS, default=0): cv.mqtt_qos,
|
||||
}),
|
||||
})
|
||||
|
||||
|
||||
@@ -96,46 +109,131 @@ def to_code(config):
|
||||
rhs = App.init_mqtt(config[CONF_BROKER], config[CONF_PORT],
|
||||
config[CONF_USERNAME], config[CONF_PASSWORD])
|
||||
mqtt = Pvariable(config[CONF_ID], rhs)
|
||||
|
||||
if not config.get(CONF_DISCOVERY, True):
|
||||
add(mqtt.disable_discovery())
|
||||
if CONF_DISCOVERY_RETAIN in config or CONF_DISCOVERY_PREFIX in config:
|
||||
elif CONF_DISCOVERY_RETAIN in config or CONF_DISCOVERY_PREFIX in config:
|
||||
discovery_retain = config.get(CONF_DISCOVERY_RETAIN, True)
|
||||
discovery_prefix = config.get(CONF_DISCOVERY_PREFIX, 'homeassistant')
|
||||
add(mqtt.set_discovery_info(discovery_prefix, discovery_retain))
|
||||
if CONF_TOPIC_PREFIX in config:
|
||||
add(mqtt.set_topic_prefix(config[CONF_TOPIC_PREFIX]))
|
||||
|
||||
if CONF_BIRTH_MESSAGE in config:
|
||||
birth_message = config[CONF_BIRTH_MESSAGE]
|
||||
if birth_message is None:
|
||||
if not birth_message:
|
||||
add(mqtt.disable_birth_message())
|
||||
else:
|
||||
add(mqtt.set_birth_message(exp_mqtt_message(birth_message)))
|
||||
if CONF_WILL_MESSAGE in config:
|
||||
will_message = config[CONF_WILL_MESSAGE]
|
||||
if will_message is None:
|
||||
if not will_message:
|
||||
add(mqtt.disable_last_will())
|
||||
else:
|
||||
add(mqtt.set_last_will(exp_mqtt_message(will_message)))
|
||||
if CONF_SHUTDOWN_MESSAGE in config:
|
||||
shutdown_message = config[CONF_SHUTDOWN_MESSAGE]
|
||||
if not shutdown_message:
|
||||
add(mqtt.disable_shutdown_message())
|
||||
else:
|
||||
add(mqtt.set_shutdown_message(exp_mqtt_message(shutdown_message)))
|
||||
|
||||
if CONF_CLIENT_ID in config:
|
||||
add(mqtt.set_client_id(config[CONF_CLIENT_ID]))
|
||||
|
||||
if CONF_LOG_TOPIC in config:
|
||||
log_topic = config[CONF_LOG_TOPIC]
|
||||
if log_topic is None:
|
||||
if not log_topic:
|
||||
add(mqtt.disable_log_message())
|
||||
else:
|
||||
add(mqtt.set_log_topic(exp_mqtt_message(log_topic)))
|
||||
add(mqtt.set_log_message_template(exp_mqtt_message(log_topic)))
|
||||
|
||||
if CONF_LEVEL in config:
|
||||
add(mqtt.set_log_level(logger.LOG_LEVELS[config[CONF_LEVEL]]))
|
||||
|
||||
if CONF_SSL_FINGERPRINTS in config:
|
||||
for fingerprint in config[CONF_SSL_FINGERPRINTS]:
|
||||
arr = [RawExpression("0x{}".format(fingerprint[i:i + 2])) for i in range(0, 40, 2)]
|
||||
add(mqtt.add_ssl_fingerprint(ArrayInitializer(*arr, multiline=False)))
|
||||
|
||||
if CONF_KEEPALIVE in config:
|
||||
add(mqtt.set_keep_alive(config[CONF_KEEPALIVE]))
|
||||
|
||||
if CONF_REBOOT_TIMEOUT in config:
|
||||
add(mqtt.set_reboot_timeout(config[CONF_REBOOT_TIMEOUT]))
|
||||
|
||||
for conf in config.get(CONF_ON_MESSAGE, []):
|
||||
rhs = mqtt.make_message_trigger(conf[CONF_TOPIC], conf[CONF_QOS])
|
||||
trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
|
||||
automation.build_automation(trigger, std_string, conf)
|
||||
|
||||
for conf in config.get(CONF_ON_JSON_MESSAGE, []):
|
||||
rhs = mqtt.make_json_message_trigger(conf[CONF_TOPIC], conf[CONF_QOS])
|
||||
trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
|
||||
automation.build_automation(trigger, JsonObjectConstRef, conf)
|
||||
|
||||
|
||||
CONF_MQTT_PUBLISH = 'mqtt.publish'
|
||||
MQTT_PUBLISH_ACTION_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_TOPIC): cv.templatable(cv.publish_topic),
|
||||
vol.Required(CONF_PAYLOAD): cv.templatable(cv.mqtt_payload),
|
||||
vol.Optional(CONF_QOS): cv.templatable(cv.mqtt_qos),
|
||||
vol.Optional(CONF_RETAIN): cv.templatable(cv.boolean),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_MQTT_PUBLISH, MQTT_PUBLISH_ACTION_SCHEMA)
|
||||
def mqtt_publish_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
rhs = App.Pget_mqtt_client().Pmake_publish_action(template_arg)
|
||||
type = MQTTPublishAction.template(template_arg)
|
||||
action = Pvariable(action_id, rhs, type=type)
|
||||
for template_ in templatable(config[CONF_TOPIC], arg_type, std_string):
|
||||
yield None
|
||||
add(action.set_topic(template_))
|
||||
|
||||
for template_ in templatable(config[CONF_PAYLOAD], arg_type, std_string):
|
||||
yield None
|
||||
add(action.set_payload(template_))
|
||||
if CONF_QOS in config:
|
||||
for template_ in templatable(config[CONF_QOS], arg_type, uint8):
|
||||
yield
|
||||
add(action.set_qos(template_))
|
||||
if CONF_RETAIN in config:
|
||||
for template_ in templatable(config[CONF_RETAIN], arg_type, bool_):
|
||||
yield None
|
||||
add(action.set_retain(template_))
|
||||
yield action
|
||||
|
||||
|
||||
CONF_MQTT_PUBLISH_JSON = 'mqtt.publish_json'
|
||||
MQTT_PUBLISH_JSON_ACTION_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_TOPIC): cv.templatable(cv.publish_topic),
|
||||
vol.Required(CONF_PAYLOAD): cv.lambda_,
|
||||
vol.Optional(CONF_QOS): cv.mqtt_qos,
|
||||
vol.Optional(CONF_RETAIN): cv.boolean,
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_MQTT_PUBLISH_JSON, MQTT_PUBLISH_JSON_ACTION_SCHEMA)
|
||||
def mqtt_publish_json_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
rhs = App.Pget_mqtt_client().Pmake_publish_json_action(template_arg)
|
||||
type = MQTTPublishJsonAction.template(template_arg)
|
||||
action = Pvariable(action_id, rhs, type=type)
|
||||
for template_ in templatable(config[CONF_TOPIC], arg_type, std_string):
|
||||
yield None
|
||||
add(action.set_topic(template_))
|
||||
|
||||
for lambda_ in process_lambda(config[CONF_PAYLOAD], [(arg_type, 'x'), (JsonObjectRef, 'root')]):
|
||||
yield None
|
||||
add(action.set_payload(lambda_))
|
||||
if CONF_QOS in config:
|
||||
add(action.set_qos(config[CONF_QOS]))
|
||||
if CONF_RETAIN in config:
|
||||
add(action.set_retain(config[CONF_RETAIN]))
|
||||
yield action
|
||||
|
||||
|
||||
def required_build_flags(config):
|
||||
if CONF_SSL_FINGERPRINTS in config:
|
||||
|
||||
@@ -1,10 +1,9 @@
|
||||
import hashlib
|
||||
import logging
|
||||
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import core
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_ID, CONF_OTA, CONF_PASSWORD, CONF_PORT, CONF_SAFE_MODE, \
|
||||
ESP_PLATFORM_ESP32, ESP_PLATFORM_ESP8266
|
||||
from esphomeyaml.core import ESPHomeYAMLError
|
||||
@@ -27,8 +26,9 @@ def to_code(config):
|
||||
rhs = App.init_ota()
|
||||
ota = Pvariable(config[CONF_ID], rhs)
|
||||
if CONF_PASSWORD in config:
|
||||
hash_ = hashlib.md5(config[CONF_PASSWORD].encode()).hexdigest()
|
||||
add(ota.set_auth_password_hash(hash_))
|
||||
add(ota.set_auth_password(config[CONF_PASSWORD]))
|
||||
if CONF_PORT in config:
|
||||
add(ota.set_port(config[CONF_PORT]))
|
||||
if config[CONF_SAFE_MODE]:
|
||||
add(ota.start_safe_mode())
|
||||
|
||||
@@ -48,6 +48,7 @@ def get_auth(config):
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_OTA'
|
||||
REQUIRED_BUILD_FLAGS = '-DUSE_NEW_OTA'
|
||||
|
||||
|
||||
def lib_deps(config):
|
||||
|
||||
@@ -1,9 +1,11 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml.automation import maybe_simple_id, ACTION_REGISTRY
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components.power_supply import PowerSupplyComponent
|
||||
from esphomeyaml.const import CONF_INVERTED, CONF_MAX_POWER, CONF_POWER_SUPPLY
|
||||
from esphomeyaml.helpers import add, esphomelib_ns, get_variable
|
||||
from esphomeyaml.const import CONF_INVERTED, CONF_MAX_POWER, CONF_POWER_SUPPLY, CONF_ID, CONF_LEVEL
|
||||
from esphomeyaml.helpers import add, esphomelib_ns, get_variable, TemplateArguments, Pvariable, \
|
||||
templatable, float_, add_job
|
||||
|
||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||
|
||||
@@ -14,14 +16,21 @@ BINARY_OUTPUT_SCHEMA = vol.Schema({
|
||||
vol.Optional(CONF_INVERTED): cv.boolean,
|
||||
})
|
||||
|
||||
BINARY_OUTPUT_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(BINARY_OUTPUT_SCHEMA.schema)
|
||||
|
||||
FLOAT_OUTPUT_SCHEMA = BINARY_OUTPUT_SCHEMA.extend({
|
||||
vol.Optional(CONF_MAX_POWER): cv.percentage,
|
||||
})
|
||||
|
||||
FLOAT_OUTPUT_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(FLOAT_OUTPUT_SCHEMA.schema)
|
||||
|
||||
output_ns = esphomelib_ns.namespace('output')
|
||||
TurnOffAction = output_ns.TurnOffAction
|
||||
TurnOnAction = output_ns.TurnOnAction
|
||||
SetLevelAction = output_ns.SetLevelAction
|
||||
|
||||
|
||||
def setup_output_platform(obj, config, skip_power_supply=False):
|
||||
def setup_output_platform_(obj, config, skip_power_supply=False):
|
||||
if CONF_INVERTED in config:
|
||||
add(obj.set_inverted(config[CONF_INVERTED]))
|
||||
if not skip_power_supply and CONF_POWER_SUPPLY in config:
|
||||
@@ -33,4 +42,61 @@ def setup_output_platform(obj, config, skip_power_supply=False):
|
||||
add(obj.set_max_power(config[CONF_MAX_POWER]))
|
||||
|
||||
|
||||
def setup_output_platform(obj, config, skip_power_supply=False):
|
||||
add_job(setup_output_platform_, obj, config, skip_power_supply)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_OUTPUT'
|
||||
|
||||
|
||||
CONF_OUTPUT_TURN_ON = 'output.turn_on'
|
||||
OUTPUT_TURN_ON_ACTION = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_OUTPUT_TURN_ON, OUTPUT_TURN_ON_ACTION)
|
||||
def output_turn_on_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_on_action(template_arg)
|
||||
type = TurnOnAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_OUTPUT_TURN_OFF = 'output.turn_off'
|
||||
OUTPUT_TURN_OFF_ACTION = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None)
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_OUTPUT_TURN_OFF, OUTPUT_TURN_OFF_ACTION)
|
||||
def output_turn_off_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_turn_off_action(template_arg)
|
||||
type = TurnOffAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
|
||||
|
||||
CONF_OUTPUT_SET_LEVEL = 'output.set_level'
|
||||
OUTPUT_SET_LEVEL_ACTION = vol.Schema({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(None),
|
||||
vol.Required(CONF_LEVEL): cv.templatable(cv.percentage),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_OUTPUT_SET_LEVEL, OUTPUT_SET_LEVEL_ACTION)
|
||||
def output_set_level_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_set_level_action(template_arg)
|
||||
type = SetLevelAction.template(arg_type)
|
||||
action = Pvariable(action_id, rhs, type=type)
|
||||
for template_ in templatable(config[CONF_LEVEL], arg_type, float_):
|
||||
yield None
|
||||
add(action.set_level(template_))
|
||||
yield action
|
||||
|
||||
@@ -11,17 +11,17 @@ ESP_PLATFORMS = [ESP_PLATFORM_ESP8266]
|
||||
|
||||
|
||||
def valid_pwm_pin(value):
|
||||
if value[CONF_NUMBER] >= 16:
|
||||
if value[CONF_NUMBER] > 16:
|
||||
raise ESPHomeYAMLError(u"ESP8266: Only pins 0-16 support PWM.")
|
||||
return value
|
||||
|
||||
|
||||
ESP8266PWMOutput = output.output_ns.ESP8266PWMOutput
|
||||
|
||||
PLATFORM_SCHEMA = output.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = output.FLOAT_OUTPUT_PLATFORM_SCHEMA.extend({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(ESP8266PWMOutput),
|
||||
vol.Required(CONF_PIN): vol.All(pins.GPIO_INTERNAL_OUTPUT_PIN_SCHEMA, valid_pwm_pin),
|
||||
}).extend(output.FLOAT_OUTPUT_SCHEMA.schema)
|
||||
vol.Required(CONF_PIN): vol.All(pins.internal_gpio_output_pin_schema, valid_pwm_pin),
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
@@ -8,10 +8,10 @@ from esphomeyaml.helpers import App, Pvariable, gpio_output_pin_expression
|
||||
|
||||
GPIOBinaryOutputComponent = output.output_ns.GPIOBinaryOutputComponent
|
||||
|
||||
PLATFORM_SCHEMA = output.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = output.BINARY_OUTPUT_PLATFORM_SCHEMA.extend({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(GPIOBinaryOutputComponent),
|
||||
vol.Required(CONF_PIN): pins.GPIO_OUTPUT_PIN_SCHEMA,
|
||||
}).extend(output.BINARY_OUTPUT_SCHEMA.schema)
|
||||
vol.Required(CONF_PIN): pins.gpio_output_pin_schema,
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
@@ -15,19 +15,19 @@ def validate_frequency_bit_depth(obj):
    bit_depth = obj.get(CONF_BIT_DEPTH, 12)
    max_freq = APB_CLOCK_FREQ / (2**bit_depth)
    if frequency > max_freq:
        raise vol.Invalid('Maximum frequency for bit depth {} is {}'.format(bit_depth, max_freq))
        raise vol.Invalid('Maximum frequency for bit depth {} is {}Hz'.format(bit_depth, max_freq))
    return obj

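The hunk only appends a Hz unit to the error message, but the constraint behind it is worth spelling out: LEDC trades resolution for frequency, so the highest usable PWM frequency is the APB clock divided by 2 to the power of the bit depth. A worked example, assuming the usual 80 MHz APB clock on the ESP32:

APB_CLOCK_FREQ = 80e6   # 80 MHz APB clock, assumed here for illustration

for bit_depth in (8, 12, 15):
    print(bit_depth, APB_CLOCK_FREQ / (2 ** bit_depth))
# 8  -> 312500.0 Hz
# 12 -> 19531.25 Hz (the default bit depth)
# 15 -> 2441.40625 Hz
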
|
||||
|
||||
LEDCOutputComponent = output.output_ns.LEDCOutputComponent
|
||||
|
||||
PLATFORM_SCHEMA = vol.All(output.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = vol.All(output.FLOAT_OUTPUT_PLATFORM_SCHEMA.extend({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(LEDCOutputComponent),
|
||||
vol.Required(CONF_PIN): pins.output_pin,
|
||||
vol.Optional(CONF_FREQUENCY): cv.frequency,
|
||||
vol.Optional(CONF_BIT_DEPTH): vol.All(vol.Coerce(int), vol.Range(min=1, max=15)),
|
||||
vol.Optional(CONF_CHANNEL): vol.All(vol.Coerce(int), vol.Range(min=0, max=15))
|
||||
}).extend(output.FLOAT_OUTPUT_SCHEMA.schema), validate_frequency_bit_depth)
|
||||
}), validate_frequency_bit_depth)
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
@@ -10,12 +10,12 @@ DEPENDENCIES = ['pca9685']
|
||||
|
||||
Channel = PCA9685OutputComponent.Channel
|
||||
|
||||
PLATFORM_SCHEMA = output.PLATFORM_SCHEMA.extend({
|
||||
PLATFORM_SCHEMA = output.FLOAT_OUTPUT_PLATFORM_SCHEMA.extend({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(Channel),
|
||||
vol.Required(CONF_CHANNEL): vol.All(vol.Coerce(int),
|
||||
vol.Range(min=0, max=15)),
|
||||
cv.GenerateID(CONF_PCA9685_ID): cv.use_variable_id(PCA9685OutputComponent),
|
||||
}).extend(output.FLOAT_OUTPUT_SCHEMA.schema)
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
|
||||
esphomeyaml/components/pn532.py (new file)
@@ -0,0 +1,43 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins, automation
|
||||
from esphomeyaml.components import binary_sensor
|
||||
from esphomeyaml.components.spi import SPIComponent
|
||||
from esphomeyaml.const import CONF_CS_PIN, CONF_ID, CONF_SPI_ID, CONF_UPDATE_INTERVAL, \
|
||||
CONF_ON_TAG, CONF_TRIGGER_ID
|
||||
from esphomeyaml.helpers import App, Pvariable, get_variable, gpio_output_pin_expression, std_string
|
||||
|
||||
DEPENDENCIES = ['spi']
|
||||
|
||||
PN532Component = binary_sensor.binary_sensor_ns.PN532Component
|
||||
PN532Trigger = binary_sensor.binary_sensor_ns.PN532Trigger
|
||||
|
||||
CONFIG_SCHEMA = vol.All(cv.ensure_list, [vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(PN532Component),
|
||||
cv.GenerateID(CONF_SPI_ID): cv.use_variable_id(SPIComponent),
|
||||
vol.Required(CONF_CS_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
|
||||
vol.Optional(CONF_ON_TAG): automation.validate_automation({
|
||||
cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(PN532Trigger),
|
||||
}),
|
||||
})])
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for conf in config:
|
||||
spi = None
|
||||
for spi in get_variable(conf[CONF_SPI_ID]):
|
||||
yield
|
||||
cs = None
|
||||
for cs in gpio_output_pin_expression(conf[CONF_CS_PIN]):
|
||||
yield
|
||||
rhs = App.make_pn532_component(spi, cs, conf.get(CONF_UPDATE_INTERVAL))
|
||||
pn532 = Pvariable(conf[CONF_ID], rhs)
|
||||
|
||||
for conf_ in conf.get(CONF_ON_TAG, []):
|
||||
trigger = Pvariable(conf_[CONF_TRIGGER_ID], pn532.make_trigger())
|
||||
automation.build_automation(trigger, std_string, conf_)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_PN532'
|
||||
@@ -9,7 +9,7 @@ PowerSupplyComponent = esphomelib_ns.PowerSupplyComponent
|
||||
|
||||
POWER_SUPPLY_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(PowerSupplyComponent),
|
||||
vol.Required(CONF_PIN): pins.GPIO_OUTPUT_PIN_SCHEMA,
|
||||
vol.Required(CONF_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Optional(CONF_ENABLE_TIME): cv.positive_time_period_milliseconds,
|
||||
vol.Optional(CONF_KEEP_ON_TIME): cv.positive_time_period_milliseconds,
|
||||
})
|
||||
|
||||
esphomeyaml/components/rdm6300.py (new file)
@@ -0,0 +1,28 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.components import binary_sensor
|
||||
from esphomeyaml.components.uart import UARTComponent
|
||||
from esphomeyaml.const import CONF_ID, CONF_UART_ID
|
||||
from esphomeyaml.helpers import App, Pvariable, get_variable
|
||||
|
||||
DEPENDENCIES = ['uart']
|
||||
|
||||
RDM6300Component = binary_sensor.binary_sensor_ns.RDM6300Component
|
||||
|
||||
CONFIG_SCHEMA = vol.All(cv.ensure_list_not_empty, [vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(RDM6300Component),
|
||||
cv.GenerateID(CONF_UART_ID): cv.use_variable_id(UARTComponent),
|
||||
})])
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for conf in config:
|
||||
uart = None
|
||||
for uart in get_variable(conf[CONF_UART_ID]):
|
||||
yield
|
||||
rhs = App.make_rdm6300_component(uart)
|
||||
Pvariable(conf[CONF_ID], rhs)
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_RDM6300'
|
||||
esphomeyaml/components/remote_receiver.py (new file)
@@ -0,0 +1,64 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.const import CONF_BUFFER_SIZE, CONF_DUMP, CONF_FILTER, CONF_ID, CONF_IDLE, \
|
||||
CONF_PIN, CONF_TOLERANCE
|
||||
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, gpio_input_pin_expression
|
||||
|
||||
remote_ns = esphomelib_ns.namespace('remote')
|
||||
|
||||
RemoteReceiverComponent = remote_ns.RemoteReceiverComponent
|
||||
|
||||
DUMPERS = {
|
||||
'lg': remote_ns.LGDumper,
|
||||
'nec': remote_ns.NECDumper,
|
||||
'panasonic': remote_ns.PanasonicDumper,
|
||||
'raw': remote_ns.RawDumper,
|
||||
'samsung': remote_ns.SamsungDumper,
|
||||
'sony': remote_ns.SonyDumper,
|
||||
'rc_switch': remote_ns.RCSwitchDumper,
|
||||
}
|
||||
|
||||
|
||||
def validate_dumpers_all(value):
    if not isinstance(value, (str, unicode)):
        raise vol.Invalid("Not valid dumpers")
    if value.upper() == "ALL":
        return list(sorted(list(DUMPERS)))
    raise vol.Invalid("Not valid dumpers")


CONFIG_SCHEMA = vol.All(cv.ensure_list, [vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(RemoteReceiverComponent),
|
||||
vol.Required(CONF_PIN): pins.gpio_input_pin_schema,
|
||||
vol.Optional(CONF_DUMP, default=[]):
|
||||
vol.Any(validate_dumpers_all,
|
||||
vol.All(cv.ensure_list, [vol.All(vol.Lower, cv.one_of(*DUMPERS))])),
|
||||
vol.Optional(CONF_TOLERANCE): vol.All(cv.percentage_int, vol.Range(min=0)),
|
||||
vol.Optional(CONF_BUFFER_SIZE): cv.validate_bytes,
|
||||
vol.Optional(CONF_FILTER): cv.positive_time_period_microseconds,
|
||||
vol.Optional(CONF_IDLE): cv.positive_time_period_microseconds,
|
||||
})])
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for conf in config:
|
||||
pin = None
|
||||
for pin in gpio_input_pin_expression(conf[CONF_PIN]):
|
||||
yield
|
||||
rhs = App.make_remote_receiver_component(pin)
|
||||
receiver = Pvariable(conf[CONF_ID], rhs)
|
||||
for dumper in conf[CONF_DUMP]:
|
||||
add(receiver.add_dumper(DUMPERS[dumper].new()))
|
||||
if CONF_TOLERANCE in conf:
|
||||
add(receiver.set_tolerance(conf[CONF_TOLERANCE]))
|
||||
if CONF_BUFFER_SIZE in conf:
|
||||
add(receiver.set_buffer_size(conf[CONF_BUFFER_SIZE]))
|
||||
if CONF_FILTER in conf:
|
||||
add(receiver.set_filter_us(conf[CONF_FILTER]))
|
||||
if CONF_IDLE in conf:
|
||||
add(receiver.set_idle_us(conf[CONF_IDLE]))
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_REMOTE_RECEIVER'
|
||||
esphomeyaml/components/remote_transmitter.py (new file)
@@ -0,0 +1,116 @@
|
||||
import voluptuous as vol
|
||||
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import pins
|
||||
from esphomeyaml.const import CONF_ADDRESS, CONF_CARRIER_DUTY_PERCENT, CONF_CHANNEL, CONF_CODE, \
|
||||
CONF_DEVICE, CONF_FAMILY, CONF_GROUP, CONF_ID, CONF_INVERTED, CONF_ONE, CONF_PIN, \
|
||||
CONF_PROTOCOL, CONF_PULSE_LENGTH, CONF_STATE, CONF_SYNC, CONF_ZERO
|
||||
from esphomeyaml.core import HexInt
|
||||
from esphomeyaml.helpers import App, Pvariable, add, esphomelib_ns, gpio_output_pin_expression
|
||||
|
||||
remote_ns = esphomelib_ns.namespace('remote')
|
||||
|
||||
RemoteTransmitterComponent = remote_ns.RemoteTransmitterComponent
|
||||
RCSwitchProtocol = remote_ns.RCSwitchProtocol
|
||||
rc_switch_protocols = remote_ns.rc_switch_protocols
|
||||
|
||||
|
||||
def validate_rc_switch_code(value):
    if not isinstance(value, (str, unicode)):
        raise vol.Invalid("All RCSwitch codes must be in quotes ('')")
    for c in value:
        if c not in ('0', '1'):
            raise vol.Invalid(u"Invalid RCSwitch code character '{}'. Only '0' and '1' are allowed"
                              u"".format(c))
    if len(value) > 32:
        raise vol.Invalid("Maximum length for RCSwitch codes is 32, code '{}' has length {}"
                          "".format(value, len(value)))
    if not value:
        raise vol.Invalid("RCSwitch code must not be empty")
    return value


RC_SWITCH_TIMING_SCHEMA = vol.All([cv.uint8_t], vol.Length(min=2, max=2))
|
||||
|
||||
RC_SWITCH_PROTOCOL_SCHEMA = vol.Any(
|
||||
vol.All(vol.Coerce(int), vol.Range(min=1, max=7)),
|
||||
vol.Schema({
|
||||
vol.Required(CONF_PULSE_LENGTH): cv.uint32_t,
|
||||
vol.Optional(CONF_SYNC, default=[1, 31]): RC_SWITCH_TIMING_SCHEMA,
|
||||
vol.Optional(CONF_ZERO, default=[1, 3]): RC_SWITCH_TIMING_SCHEMA,
|
||||
vol.Optional(CONF_ONE, default=[3, 1]): RC_SWITCH_TIMING_SCHEMA,
|
||||
vol.Optional(CONF_INVERTED, default=False): cv.boolean,
|
||||
})
|
||||
)
|
||||
|
||||
RC_SWITCH_RAW_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_CODE): validate_rc_switch_code,
|
||||
vol.Optional(CONF_PROTOCOL, default=1): RC_SWITCH_PROTOCOL_SCHEMA,
|
||||
})
|
||||
RC_SWITCH_TYPE_A_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_GROUP): vol.All(validate_rc_switch_code, vol.Length(min=5, max=5)),
|
||||
vol.Required(CONF_DEVICE): vol.All(validate_rc_switch_code, vol.Length(min=5, max=5)),
|
||||
vol.Required(CONF_STATE): cv.boolean,
|
||||
vol.Optional(CONF_PROTOCOL, default=1): RC_SWITCH_PROTOCOL_SCHEMA,
|
||||
})
|
||||
RC_SWITCH_TYPE_B_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_ADDRESS): vol.All(cv.uint8_t, vol.Range(min=1, max=4)),
|
||||
vol.Required(CONF_CHANNEL): vol.All(cv.uint8_t, vol.Range(min=1, max=4)),
|
||||
vol.Required(CONF_STATE): cv.boolean,
|
||||
vol.Optional(CONF_PROTOCOL, default=1): RC_SWITCH_PROTOCOL_SCHEMA,
|
||||
})
|
||||
RC_SWITCH_TYPE_C_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_FAMILY): vol.All(
|
||||
cv.string, vol.Lower,
|
||||
cv.one_of('a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o',
|
||||
'p')),
|
||||
vol.Required(CONF_GROUP): vol.All(cv.uint8_t, vol.Range(min=1, max=4)),
|
||||
vol.Required(CONF_DEVICE): vol.All(cv.uint8_t, vol.Range(min=1, max=4)),
|
||||
vol.Required(CONF_STATE): cv.boolean,
|
||||
vol.Optional(CONF_PROTOCOL, default=1): RC_SWITCH_PROTOCOL_SCHEMA,
|
||||
})
|
||||
RC_SWITCH_TYPE_D_SCHEMA = vol.Schema({
|
||||
vol.Required(CONF_GROUP): vol.All(cv.string, vol.Lower, cv.one_of('a', 'b', 'c', 'd')),
|
||||
vol.Required(CONF_DEVICE): vol.All(cv.uint8_t, vol.Range(min=1, max=3)),
|
||||
vol.Required(CONF_STATE): cv.boolean,
|
||||
vol.Optional(CONF_PROTOCOL, default=1): RC_SWITCH_PROTOCOL_SCHEMA,
|
||||
})
|
||||
|
||||
CONFIG_SCHEMA = vol.All(cv.ensure_list, [vol.Schema({
|
||||
cv.GenerateID(): cv.declare_variable_id(RemoteTransmitterComponent),
|
||||
vol.Required(CONF_PIN): pins.gpio_output_pin_schema,
|
||||
vol.Optional(CONF_CARRIER_DUTY_PERCENT): vol.All(cv.percentage_int,
|
||||
vol.Range(min=1, max=100)),
|
||||
})])
|
||||
|
||||
|
||||
def build_rc_switch_protocol(config):
    if isinstance(config, int):
        return rc_switch_protocols[config]
    pl = config[CONF_PULSE_LENGTH]
    return RCSwitchProtocol(config[CONF_SYNC][0] * pl, config[CONF_SYNC][1] * pl,
                            config[CONF_ZERO][0] * pl, config[CONF_ZERO][1] * pl,
                            config[CONF_ONE][0] * pl, config[CONF_ONE][1] * pl,
                            config[CONF_INVERTED])
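
The sync, zero and one entries are multipliers of pulse_length, so the defaults ([1, 31], [1, 3], [3, 1]) only become concrete microsecond timings once a pulse length is chosen. A quick worked example with an illustrative 350 us pulse length:

pulse_length = 350   # microseconds, illustrative value only

for name, (high, low) in (('sync', (1, 31)), ('zero', (1, 3)), ('one', (3, 1))):
    print(name, high * pulse_length, low * pulse_length)
# sync 350 10850
# zero 350 1050
# one 1050 350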
|
||||
|
||||
|
||||
def binary_code(value):
    code = 0
    for val in value:
        code <<= 1
        code |= val == '1'
    return HexInt(code)
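
binary_code folds an RCSwitch bit string into an integer, most significant bit first. The same logic without the HexInt wrapper, as a quick check:

def binary_code_plain(value):
    # Shift-and-or over the '0'/'1' characters, MSB first
    code = 0
    for val in value:
        code <<= 1
        code |= val == '1'
    return code

print(hex(binary_code_plain('1010')))        # 0xa
print(hex(binary_code_plain('000101011')))   # 0x2b
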
def to_code(config):
|
||||
for conf in config:
|
||||
pin = None
|
||||
for pin in gpio_output_pin_expression(conf[CONF_PIN]):
|
||||
yield
|
||||
rhs = App.make_remote_transmitter_component(pin)
|
||||
transmitter = Pvariable(conf[CONF_ID], rhs)
|
||||
if CONF_CARRIER_DUTY_PERCENT in conf:
|
||||
add(transmitter.set_carrier_duty_percent(conf[CONF_CARRIER_DUTY_PERCENT]))
|
||||
|
||||
|
||||
BUILD_FLAGS = '-DUSE_REMOTE_TRANSMITTER'
|
||||
esphomeyaml/components/script.py (new file)
@@ -0,0 +1,36 @@
|
||||
import voluptuous as vol
|
||||
|
||||
from esphomeyaml import automation
|
||||
from esphomeyaml.automation import ACTION_REGISTRY, maybe_simple_id
|
||||
import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml.const import CONF_ID
|
||||
from esphomeyaml.helpers import NoArg, Pvariable, TemplateArguments, esphomelib_ns, get_variable
|
||||
|
||||
Script = esphomelib_ns.Script
|
||||
ScriptExecuteAction = esphomelib_ns.ScriptExecuteAction
|
||||
|
||||
CONFIG_SCHEMA = automation.validate_automation({
|
||||
vol.Required(CONF_ID): cv.declare_variable_id(Script),
|
||||
})
|
||||
|
||||
|
||||
def to_code(config):
|
||||
for conf in config:
|
||||
trigger = Pvariable(conf[CONF_ID], Script.new())
|
||||
automation.build_automation(trigger, NoArg, conf)
|
||||
|
||||
|
||||
CONF_SCRIPT_EXECUTE = 'script.execute'
|
||||
SCRIPT_EXECUTE_ACTION_SCHEMA = maybe_simple_id({
|
||||
vol.Required(CONF_ID): cv.use_variable_id(Script),
|
||||
})
|
||||
|
||||
|
||||
@ACTION_REGISTRY.register(CONF_SCRIPT_EXECUTE, SCRIPT_EXECUTE_ACTION_SCHEMA)
|
||||
def script_execute_action_to_code(config, action_id, arg_type):
|
||||
template_arg = TemplateArguments(arg_type)
|
||||
for var in get_variable(config[CONF_ID]):
|
||||
yield None
|
||||
rhs = var.make_execute_action(template_arg)
|
||||
type = ScriptExecuteAction.template(arg_type)
|
||||
yield Pvariable(action_id, rhs, type=type)
|
||||
@@ -4,12 +4,13 @@ import esphomeyaml.config_validation as cv
|
||||
from esphomeyaml import automation
|
||||
from esphomeyaml.const import CONF_ABOVE, CONF_ACCURACY_DECIMALS, CONF_ALPHA, CONF_BELOW, \
|
||||
CONF_DEBOUNCE, CONF_DELTA, CONF_EXPIRE_AFTER, CONF_EXPONENTIAL_MOVING_AVERAGE, CONF_FILTERS, \
|
||||
CONF_FILTER_NAN, CONF_FILTER_OUT, CONF_HEARTBEAT, CONF_ICON, CONF_ID, CONF_LAMBDA, \
|
||||
CONF_MQTT_ID, CONF_MULTIPLY, CONF_NAME, CONF_OFFSET, CONF_ON_RAW_VALUE, CONF_ON_VALUE,\
|
||||
CONF_FILTER_NAN, CONF_FILTER_OUT, CONF_HEARTBEAT, CONF_ICON, CONF_ID, CONF_INTERNAL, \
|
||||
CONF_LAMBDA, CONF_MQTT_ID, CONF_MULTIPLY, CONF_OFFSET, CONF_ON_RAW_VALUE, CONF_ON_VALUE, \
|
||||
CONF_ON_VALUE_RANGE, CONF_OR, CONF_SEND_EVERY, CONF_SLIDING_WINDOW_MOVING_AVERAGE, \
|
||||
CONF_THROTTLE, CONF_TRIGGER_ID, CONF_UNIQUE, CONF_UNIT_OF_MEASUREMENT, CONF_WINDOW_SIZE
|
||||
from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, add, esphomelib_ns, float_, \
|
||||
process_lambda, setup_mqtt_component, templatable, add_job
|
||||
CONF_THROTTLE, CONF_TRIGGER_ID, CONF_UNIQUE, CONF_UNIT_OF_MEASUREMENT, CONF_WINDOW_SIZE, \
|
||||
CONF_SEND_FIRST_AT
|
||||
from esphomeyaml.helpers import App, ArrayInitializer, Pvariable, add, add_job, esphomelib_ns, \
|
||||
float_, process_lambda, setup_mqtt_component, templatable
|
||||
|
||||
PLATFORM_SCHEMA = cv.PLATFORM_SCHEMA.extend({
|
||||
|
||||
@@ -20,6 +21,15 @@ def validate_recursive_filter(value):
|
||||
return FILTERS_SCHEMA(value)
|
||||
|
||||
|
||||
def validate_send_first_at(value):
    send_first_at = value.get(CONF_SEND_FIRST_AT)
    send_every = value[CONF_SEND_EVERY]
    if send_first_at is not None and send_first_at > send_every:
        raise vol.Invalid("send_first_at must be smaller than or equal to send_every! {} <= {}"
                          "".format(send_first_at, send_every))
    return value

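send_every controls how often the sliding-window average is published, and send_first_at lets the first value out before a full interval has passed. A simplified standalone model of those two knobs; the real filter is implemented in esphomelib's C++ sensor code, so this is only an approximation of its behaviour:

from collections import deque

def sliding_window_average(values, window_size, send_every, send_first_at):
    # Simplified model for illustration, not the esphomelib implementation.
    window = deque(maxlen=window_size)
    published = []
    for i, value in enumerate(values, start=1):
        window.append(value)
        if i >= send_first_at and (i - send_first_at) % send_every == 0:
            published.append(sum(window) / float(len(window)))
    return published

print(sliding_window_average([10, 11, 12, 13, 14, 15, 16],
                             window_size=5, send_every=3, send_first_at=1))
# [10.0, 11.5, 14.0] -> published at the 1st, 4th and 7th raw value
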
FILTER_KEYS = [CONF_OFFSET, CONF_MULTIPLY, CONF_FILTER_OUT, CONF_FILTER_NAN,
|
||||
CONF_SLIDING_WINDOW_MOVING_AVERAGE, CONF_EXPONENTIAL_MOVING_AVERAGE, CONF_LAMBDA,
|
||||
CONF_THROTTLE, CONF_DELTA, CONF_UNIQUE, CONF_HEARTBEAT, CONF_DEBOUNCE, CONF_OR]
|
||||
@@ -29,10 +39,11 @@ FILTERS_SCHEMA = vol.All(cv.ensure_list, [vol.All({
|
||||
vol.Optional(CONF_MULTIPLY): vol.Coerce(float),
|
||||
vol.Optional(CONF_FILTER_OUT): vol.Coerce(float),
|
||||
vol.Optional(CONF_FILTER_NAN): None,
|
||||
vol.Optional(CONF_SLIDING_WINDOW_MOVING_AVERAGE): vol.Schema({
|
||||
vol.Optional(CONF_SLIDING_WINDOW_MOVING_AVERAGE): vol.All(vol.Schema({
|
||||
vol.Required(CONF_WINDOW_SIZE): cv.positive_not_null_int,
|
||||
vol.Required(CONF_SEND_EVERY): cv.positive_not_null_int,
|
||||
}),
|
||||
vol.Optional(CONF_SEND_FIRST_AT): cv.positive_not_null_int,
|
||||
}), validate_send_first_at),
|
||||
vol.Optional(CONF_EXPONENTIAL_MOVING_AVERAGE): vol.Schema({
|
||||
vol.Required(CONF_ALPHA): cv.positive_float,
|
||||
vol.Required(CONF_SEND_EVERY): cv.positive_not_null_int,
|
||||
@@ -64,33 +75,33 @@ HeartbeatFilter = sensor_ns.HeartbeatFilter
DebounceFilter = sensor_ns.DebounceFilter
UniqueFilter = sensor_ns.UniqueFilter

SensorValueTrigger = sensor_ns.SensorValueTrigger
RawSensorValueTrigger = sensor_ns.RawSensorValueTrigger
SensorStateTrigger = sensor_ns.SensorStateTrigger
SensorRawStateTrigger = sensor_ns.SensorRawStateTrigger
ValueRangeTrigger = sensor_ns.ValueRangeTrigger

SENSOR_SCHEMA = cv.MQTT_COMPONENT_SCHEMA.extend({
    cv.GenerateID(CONF_MQTT_ID): cv.declare_variable_id(MQTTSensorComponent),
    cv.GenerateID(): cv.declare_variable_id(Sensor),
    vol.Required(CONF_NAME): cv.string,
    vol.Optional(CONF_UNIT_OF_MEASUREMENT): cv.string_strict,
    vol.Optional(CONF_ICON): cv.icon,
    vol.Optional(CONF_ACCURACY_DECIMALS): vol.Coerce(int),
    vol.Optional(CONF_EXPIRE_AFTER): vol.Any(None, cv.positive_time_period_milliseconds),
    vol.Optional(CONF_FILTERS): FILTERS_SCHEMA,
    vol.Optional(CONF_ON_VALUE): vol.All(cv.ensure_list, [automation.AUTOMATION_SCHEMA.extend({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(SensorValueTrigger),
    })]),
    vol.Optional(CONF_ON_RAW_VALUE): vol.All(cv.ensure_list, [automation.AUTOMATION_SCHEMA.extend({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(RawSensorValueTrigger),
    })]),
    vol.Optional(CONF_ON_VALUE_RANGE): vol.All(cv.ensure_list, [vol.All(
        automation.AUTOMATION_SCHEMA.extend({
            cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(ValueRangeTrigger),
            vol.Optional(CONF_ABOVE): vol.Coerce(float),
            vol.Optional(CONF_BELOW): vol.Coerce(float),
        }), cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW))]),
    vol.Optional(CONF_ON_VALUE): automation.validate_automation({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(SensorStateTrigger),
    }),
    vol.Optional(CONF_ON_RAW_VALUE): automation.validate_automation({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(SensorRawStateTrigger),
    }),
    vol.Optional(CONF_ON_VALUE_RANGE): automation.validate_automation({
        cv.GenerateID(CONF_TRIGGER_ID): cv.declare_variable_id(ValueRangeTrigger),
        vol.Optional(CONF_ABOVE): vol.Coerce(float),
        vol.Optional(CONF_BELOW): vol.Coerce(float),
    }, cv.has_at_least_one_key(CONF_ABOVE, CONF_BELOW)),
})

SENSOR_PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend(SENSOR_SCHEMA.schema)


def setup_filter(config):
    if CONF_OFFSET in config:
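The `on_value`, `on_raw_value`, and `on_value_range` entries move from an explicit `vol.All(cv.ensure_list, [AUTOMATION_SCHEMA.extend({...})])` construction to a single `automation.validate_automation({...})` call. The helper's body is not part of this diff; the sketch below only illustrates the underlying "coerce to a list, then validate each item against an extended schema" idiom with plain voluptuous, as an assumption about the shape of such a wrapper rather than its real implementation:

```python
import voluptuous as vol


def ensure_list(value):
    # Let users write either a single automation block or a list of them.
    if value is None:
        return []
    return value if isinstance(value, list) else [value]


def validate_automation_like(extra_schema):
    # Hypothetical stand-in: extend a base item schema and apply it to every list entry.
    item_schema = vol.Schema({vol.Optional('then'): list}).extend(extra_schema)
    return vol.All(ensure_list, [item_schema])


on_value = validate_automation_like({vol.Optional('above'): vol.Coerce(float)})
print(on_value({'above': 21.5, 'then': []}))   # -> [{'above': 21.5, 'then': []}]
```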
@@ -100,10 +111,11 @@ def setup_filter(config):
    elif CONF_FILTER_OUT in config:
        yield FilterOutValueFilter.new(config[CONF_FILTER_OUT])
    elif CONF_FILTER_NAN in config:
        yield FilterOutNANFilter()
        yield FilterOutNANFilter.new()
    elif CONF_SLIDING_WINDOW_MOVING_AVERAGE in config:
        conf = config[CONF_SLIDING_WINDOW_MOVING_AVERAGE]
        yield SlidingWindowMovingAverageFilter.new(conf[CONF_WINDOW_SIZE], conf[CONF_SEND_EVERY])
        yield SlidingWindowMovingAverageFilter.new(conf[CONF_WINDOW_SIZE], conf[CONF_SEND_EVERY],
                                                   conf.get(CONF_SEND_FIRST_AT))
    elif CONF_EXPONENTIAL_MOVING_AVERAGE in config:
        conf = config[CONF_EXPONENTIAL_MOVING_AVERAGE]
        yield ExponentialMovingAverageFilter.new(conf[CONF_ALPHA], conf[CONF_SEND_EVERY])
@@ -117,7 +129,10 @@ def setup_filter(config):
    elif CONF_DELTA in config:
        yield DeltaFilter.new(config[CONF_DELTA])
    elif CONF_OR in config:
        yield OrFilter.new(setup_filters(config[CONF_OR]))
        filters = None
        for filters in setup_filters(config[CONF_OR]):
            yield None
        yield OrFilter.new(filters)
    elif CONF_HEARTBEAT in config:
        yield App.register_component(HeartbeatFilter.new(config[CONF_HEARTBEAT]))
    elif CONF_DEBOUNCE in config:
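The rewritten `or:` branch uses the generator idiom that runs through this whole code generator: iterate a sub-coroutine, re-yield until it produces its final expression, and then read the loop variable after the loop. A toy, esphomelib-free illustration of that control flow (the names and strings below are invented for the demo):

```python
def resolve_child():
    # Pretend the child expression is not available on the first pass.
    yield None
    yield 'child_filters'


def resolve_parent():
    child = None
    for child in resolve_child():
        yield None                     # propagate "not ready yet" upwards
    # After the loop, `child` holds the last value the sub-generator yielded.
    yield 'OrFilter.new({})'.format(child)


print(list(resolve_parent())[-1])      # -> OrFilter.new(child_filters)
```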
@@ -131,12 +146,14 @@ def setup_filters(config):
    for conf in config:
        filter = None
        for filter in setup_filter(conf):
            yield
            yield None
        filters.append(filter)
    yield ArrayInitializer(*filters)


def setup_sensor_core_(sensor_var, mqtt_var, config):
    if CONF_INTERNAL in config:
        add(sensor_var.set_internal(config[CONF_INTERNAL]))
    if CONF_UNIT_OF_MEASUREMENT in config:
        add(sensor_var.set_unit_of_measurement(config[CONF_UNIT_OF_MEASUREMENT]))
    if CONF_ICON in config:
@@ -150,11 +167,11 @@ def setup_sensor_core_(sensor_var, mqtt_var, config):
        add(sensor_var.set_filters(filters))

    for conf in config.get(CONF_ON_VALUE, []):
        rhs = sensor_var.make_value_trigger()
        rhs = sensor_var.make_state_trigger()
        trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
        automation.build_automation(trigger, float_, conf)
    for conf in config.get(CONF_ON_RAW_VALUE, []):
        rhs = sensor_var.make_raw_value_trigger()
        rhs = sensor_var.make_raw_state_trigger()
        trigger = Pvariable(conf[CONF_TRIGGER_ID], rhs)
        automation.build_automation(trigger, float_, conf)
    for conf in config.get(CONF_ON_VALUE_RANGE, []):
@@ -164,12 +181,12 @@ def setup_sensor_core_(sensor_var, mqtt_var, config):
            template_ = None
            for template_ in templatable(conf[CONF_ABOVE], float_, float_):
                yield
            trigger.set_min(template_)
            add(trigger.set_min(template_))
        if CONF_BELOW in conf:
            template_ = None
            for template_ in templatable(conf[CONF_BELOW], float_, float_):
                yield
            trigger.set_max(template_)
            add(trigger.set_max(template_))
        automation.build_automation(trigger, float_, conf)

    if CONF_EXPIRE_AFTER in config:

@@ -24,12 +24,12 @@ def validate_adc_pin(value):

MakeADCSensor = Application.MakeADCSensor

PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeADCSensor),
    vol.Required(CONF_PIN): validate_adc_pin,
    vol.Optional(CONF_ATTENUATION): vol.All(cv.only_on_esp32, cv.one_of(*ATTENUATION_MODES)),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
}).extend(sensor.SENSOR_SCHEMA.schema)
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}))


def to_code(config):

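adc (and ads1115, bh1750, dallas below) switch from extending `sensor.SENSOR_SCHEMA.schema` by hand to wrapping the platform schema in `cv.nameable(...)`. The body of `cv.nameable` is not shown in this diff; the wrapper below is only a guess at the kind of validator it could be, written as a hypothetical stand-in rather than the real helper:

```python
import voluptuous as vol


def nameable(schema):
    # Hypothetical sketch: run the wrapped schema, then fall back to the generated
    # id when no explicit name was given. The real cv.nameable may behave differently.
    def validator(config):
        config = schema(config)
        if not config.get('name') and config.get('id'):
            config = dict(config, name=config['id'])
        return config
    return validator


PLATFORM_SCHEMA = nameable(vol.Schema({
    vol.Optional('name'): str,
    vol.Optional('id'): str,
}, extra=vol.ALLOW_EXTRA))

print(PLATFORM_SCHEMA({'id': 'living_room_adc'}))   # name falls back to 'living_room_adc'
```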
@@ -45,12 +45,12 @@ def validate_mux(value):
    return cv.one_of(*MUX)(value)


PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    vol.Required(CONF_MULTIPLEXER): validate_mux,
    vol.Required(CONF_GAIN): validate_gain,
    cv.GenerateID(CONF_ADS1115_ID): cv.use_variable_id(ADS1115Component),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
}).extend(sensor.SENSOR_SCHEMA.schema)
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}))


def to_code(config):

@@ -16,12 +16,12 @@ BH1750_RESOLUTIONS = {

MakeBH1750Sensor = Application.MakeBH1750Sensor

PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeBH1750Sensor),
    vol.Optional(CONF_ADDRESS, default=0x23): cv.i2c_address,
    vol.Optional(CONF_RESOLUTION): vol.All(cv.positive_float, cv.one_of(*BH1750_RESOLUTIONS)),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
}).extend(sensor.SENSOR_SCHEMA.schema)
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
}))


def to_code(config):

esphomeyaml/components/sensor/ble_rssi.py (new file, 23 lines)
@@ -0,0 +1,23 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml.components import sensor
from esphomeyaml.components.esp32_ble_tracker import CONF_ESP32_BLE_ID, ESP32BLETracker, \
    make_address_array
from esphomeyaml.const import CONF_MAC_ADDRESS, CONF_NAME
from esphomeyaml.helpers import get_variable

DEPENDENCIES = ['esp32_ble_tracker']

PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    vol.Required(CONF_MAC_ADDRESS): cv.mac_address,
    cv.GenerateID(CONF_ESP32_BLE_ID): cv.use_variable_id(ESP32BLETracker)
}))


def to_code(config):
    hub = None
    for hub in get_variable(config[CONF_ESP32_BLE_ID]):
        yield
    rhs = hub.make_rssi_sensor(config[CONF_NAME], make_address_array(config[CONF_MAC_ADDRESS]))
    sensor.register_sensor(rhs, config)
@@ -34,11 +34,11 @@ MakeBME280Sensor = Application.MakeBME280Sensor
PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeBME280Sensor),
    vol.Optional(CONF_ADDRESS, default=0x77): cv.i2c_address,
    vol.Required(CONF_TEMPERATURE): BME280_OVERSAMPLING_SENSOR_SCHEMA,
    vol.Required(CONF_PRESSURE): BME280_OVERSAMPLING_SENSOR_SCHEMA,
    vol.Required(CONF_HUMIDITY): BME280_OVERSAMPLING_SENSOR_SCHEMA,
    vol.Required(CONF_TEMPERATURE): cv.nameable(BME280_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Required(CONF_PRESSURE): cv.nameable(BME280_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Required(CONF_HUMIDITY): cv.nameable(BME280_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Optional(CONF_IIR_FILTER): vol.All(vol.Upper, cv.one_of(*IIR_FILTER_OPTIONS)),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
})


@@ -1,10 +1,11 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml import core
from esphomeyaml.components import sensor
from esphomeyaml.const import CONF_ADDRESS, CONF_GAS_RESISTANCE, CONF_HUMIDITY, CONF_IIR_FILTER, \
    CONF_MAKE_ID, CONF_NAME, CONF_OVERSAMPLING, CONF_PRESSURE, CONF_TEMPERATURE, \
    CONF_UPDATE_INTERVAL
    CONF_UPDATE_INTERVAL, CONF_HEATER, CONF_DURATION
from esphomeyaml.helpers import App, Application, add, variable

DEPENDENCIES = ['i2c']
@@ -38,13 +39,17 @@ MakeBME680Sensor = Application.MakeBME680Sensor
PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeBME680Sensor),
    vol.Optional(CONF_ADDRESS, default=0x76): cv.i2c_address,
    vol.Required(CONF_TEMPERATURE): BME680_OVERSAMPLING_SENSOR_SCHEMA,
    vol.Required(CONF_PRESSURE): BME680_OVERSAMPLING_SENSOR_SCHEMA,
    vol.Required(CONF_HUMIDITY): BME680_OVERSAMPLING_SENSOR_SCHEMA,
    vol.Required(CONF_GAS_RESISTANCE): sensor.SENSOR_SCHEMA,
    vol.Required(CONF_TEMPERATURE): cv.nameable(BME680_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Required(CONF_PRESSURE): cv.nameable(BME680_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Required(CONF_HUMIDITY): cv.nameable(BME680_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Required(CONF_GAS_RESISTANCE): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Optional(CONF_IIR_FILTER): vol.All(vol.Upper, cv.one_of(*IIR_FILTER_OPTIONS)),
    # TODO: Heater
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_HEATER): vol.Any(None, vol.All(vol.Schema({
        vol.Optional(CONF_TEMPERATURE, default=320): vol.All(vol.Coerce(int), vol.Range(200, 400)),
        vol.Optional(CONF_DURATION, default='150ms'): vol.All(
            cv.positive_time_period_milliseconds, vol.Range(max=core.TimePeriod(milliseconds=4032)))
    }, cv.has_at_least_one_key(CONF_TEMPERATURE, CONF_DURATION)))),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
})


@@ -69,6 +74,12 @@ def to_code(config):
    if CONF_IIR_FILTER in config:
        constant = IIR_FILTER_OPTIONS[config[CONF_IIR_FILTER]]
        add(bme680.set_iir_filter(constant))
    if CONF_HEATER in config:
        conf = config[CONF_HEATER]
        if not conf:
            add(bme680.set_heater(0, 0))
        else:
            add(bme680.set_heater(conf[CONF_TEMPERATURE], conf[CONF_DURATION]))

    sensor.setup_sensor(bme680.Pget_temperature_sensor(), make.Pmqtt_temperature,
                        config[CONF_TEMPERATURE])

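The new `heater:` option accepts either an explicit null (which the generated code turns into `set_heater(0, 0)`) or a mapping with temperature and duration defaults. A self-contained sketch of that `vol.Any(None, vol.Schema({...}))` shape with simplified value types (plain millisecond integers stand in for `cv.positive_time_period_milliseconds` and `core.TimePeriod`):

```python
import voluptuous as vol

HEATER_SCHEMA = vol.Any(None, vol.Schema({
    vol.Optional('temperature', default=320): vol.All(vol.Coerce(int), vol.Range(200, 400)),
    vol.Optional('duration', default=150): vol.All(vol.Coerce(int), vol.Range(max=4032)),
}))

print(HEATER_SCHEMA(None))                   # -> None: generated code emits set_heater(0, 0)
print(HEATER_SCHEMA({'temperature': 300}))   # -> {'temperature': 300, 'duration': 150}
```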
@@ -12,10 +12,10 @@ MakeBMP085Sensor = Application.MakeBMP085Sensor

PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeBMP085Sensor),
    vol.Required(CONF_TEMPERATURE): sensor.SENSOR_SCHEMA,
    vol.Required(CONF_PRESSURE): sensor.SENSOR_SCHEMA,
    vol.Required(CONF_TEMPERATURE): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Required(CONF_PRESSURE): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Optional(CONF_ADDRESS): cv.i2c_address,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
})


esphomeyaml/components/sensor/bmp280.py (new file, 67 lines)
@@ -0,0 +1,67 @@
import voluptuous as vol

import esphomeyaml.config_validation as cv
from esphomeyaml.components import sensor
from esphomeyaml.const import CONF_ADDRESS, CONF_IIR_FILTER, CONF_MAKE_ID, \
    CONF_NAME, CONF_OVERSAMPLING, CONF_PRESSURE, CONF_TEMPERATURE, CONF_UPDATE_INTERVAL
from esphomeyaml.helpers import App, Application, add, variable

DEPENDENCIES = ['i2c']

OVERSAMPLING_OPTIONS = {
    'NONE': sensor.sensor_ns.BMP280_OVERSAMPLING_NONE,
    '1X': sensor.sensor_ns.BMP280_OVERSAMPLING_1X,
    '2X': sensor.sensor_ns.BMP280_OVERSAMPLING_2X,
    '4X': sensor.sensor_ns.BMP280_OVERSAMPLING_4X,
    '8X': sensor.sensor_ns.BMP280_OVERSAMPLING_8X,
    '16X': sensor.sensor_ns.BMP280_OVERSAMPLING_16X,
}

IIR_FILTER_OPTIONS = {
    'OFF': sensor.sensor_ns.BMP280_IIR_FILTER_OFF,
    '2X': sensor.sensor_ns.BMP280_IIR_FILTER_2X,
    '4X': sensor.sensor_ns.BMP280_IIR_FILTER_4X,
    '8X': sensor.sensor_ns.BMP280_IIR_FILTER_8X,
    '16X': sensor.sensor_ns.BMP280_IIR_FILTER_16X,
}

BMP280_OVERSAMPLING_SENSOR_SCHEMA = sensor.SENSOR_SCHEMA.extend({
    vol.Optional(CONF_OVERSAMPLING): vol.All(vol.Upper, cv.one_of(*OVERSAMPLING_OPTIONS)),
})

MakeBMP280Sensor = Application.MakeBMP280Sensor

PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeBMP280Sensor),
    vol.Optional(CONF_ADDRESS, default=0x77): cv.i2c_address,
    vol.Required(CONF_TEMPERATURE): cv.nameable(BMP280_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Required(CONF_PRESSURE): cv.nameable(BMP280_OVERSAMPLING_SENSOR_SCHEMA),
    vol.Optional(CONF_IIR_FILTER): vol.All(vol.Upper, cv.one_of(*IIR_FILTER_OPTIONS)),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
})


def to_code(config):
    rhs = App.make_bmp280_sensor(config[CONF_TEMPERATURE][CONF_NAME],
                                 config[CONF_PRESSURE][CONF_NAME],
                                 config[CONF_ADDRESS],
                                 config.get(CONF_UPDATE_INTERVAL))
    make = variable(config[CONF_MAKE_ID], rhs)
    bmp280 = make.Pbmp280
    if CONF_OVERSAMPLING in config[CONF_TEMPERATURE]:
        constant = OVERSAMPLING_OPTIONS[config[CONF_TEMPERATURE][CONF_OVERSAMPLING]]
        add(bmp280.set_temperature_oversampling(constant))
    if CONF_OVERSAMPLING in config[CONF_PRESSURE]:
        constant = OVERSAMPLING_OPTIONS[config[CONF_PRESSURE][CONF_OVERSAMPLING]]
        add(bmp280.set_pressure_oversampling(constant))
    if CONF_IIR_FILTER in config:
        constant = IIR_FILTER_OPTIONS[config[CONF_IIR_FILTER]]
        add(bmp280.set_iir_filter(constant))

    sensor.setup_sensor(bmp280.Pget_temperature_sensor(), make.Pmqtt_temperature,
                        config[CONF_TEMPERATURE])
    sensor.setup_sensor(bmp280.Pget_pressure_sensor(), make.Pmqtt_pressure,
                        config[CONF_PRESSURE])


BUILD_FLAGS = '-DUSE_BMP280'
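bmp280.py maps user-facing option strings onto code-generation constants with `vol.All(vol.Upper, cv.one_of(*OPTIONS))` at validation time and a dictionary lookup inside `to_code()`. A dependency-free sketch of that two-step pattern (`vol.In` stands in for `cv.one_of`, and the integer values below are placeholders for the `sensor_ns.BMP280_*` constants):

```python
import voluptuous as vol

# Placeholder values; the real mapping points at sensor_ns.BMP280_OVERSAMPLING_* constants.
OVERSAMPLING_OPTIONS = {'NONE': 0, '1X': 1, '2X': 2, '4X': 3, '8X': 4, '16X': 5}

oversampling = vol.All(vol.Upper, vol.In(OVERSAMPLING_OPTIONS))

validated = oversampling('16x')              # user input is upper-cased, then checked -> '16X'
constant = OVERSAMPLING_OPTIONS[validated]   # looked up later, in to_code()
print(validated, constant)                   # -> 16X 5
```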
esphomeyaml/components/sensor/cse7766.py (new file, 42 lines)
@@ -0,0 +1,42 @@
import voluptuous as vol

from esphomeyaml.components import sensor
from esphomeyaml.components.uart import UARTComponent
import esphomeyaml.config_validation as cv
from esphomeyaml.const import CONF_CURRENT, CONF_ID, CONF_NAME, CONF_POWER, CONF_UART_ID, \
    CONF_VOLTAGE
from esphomeyaml.helpers import App, Pvariable, get_variable

DEPENDENCIES = ['uart']

CSE7766Component = sensor.sensor_ns.CSE7766Component

PLATFORM_SCHEMA = vol.All(sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(): cv.declare_variable_id(CSE7766Component),
    cv.GenerateID(CONF_UART_ID): cv.use_variable_id(UARTComponent),

    vol.Optional(CONF_VOLTAGE): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Optional(CONF_CURRENT): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Optional(CONF_POWER): cv.nameable(sensor.SENSOR_SCHEMA),
}), cv.has_at_least_one_key(CONF_VOLTAGE, CONF_CURRENT, CONF_POWER))


def to_code(config):
    for uart in get_variable(config[CONF_UART_ID]):
        yield

    rhs = App.make_cse7766(uart)
    cse = Pvariable(config[CONF_ID], rhs)

    if CONF_VOLTAGE in config:
        conf = config[CONF_VOLTAGE]
        sensor.register_sensor(cse.make_voltage_sensor(conf[CONF_NAME]), conf)
    if CONF_CURRENT in config:
        conf = config[CONF_CURRENT]
        sensor.register_sensor(cse.make_current_sensor(conf[CONF_NAME]), conf)
    if CONF_POWER in config:
        conf = config[CONF_POWER]
        sensor.register_sensor(cse.make_power_sensor(conf[CONF_NAME]), conf)


BUILD_FLAGS = '-DUSE_CSE7766'
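The cse7766 platform (and the dallas hunk below) guard the whole schema with `cv.has_at_least_one_key(...)`, so a config naming none of voltage/current/power is rejected even though each key is individually optional. A minimal stand-in for that helper in plain voluptuous (the real `cv.has_at_least_one_key` is not shown in this diff):

```python
import voluptuous as vol


def has_at_least_one_key(*keys):
    # Assumed behaviour: fail unless the validated mapping contains one of the keys.
    def validator(config):
        if not any(key in config for key in keys):
            raise vol.Invalid('must contain at least one of {}'.format(', '.join(keys)))
        return config
    return validator


CSE7766_LIKE_SCHEMA = vol.All(vol.Schema({
    vol.Optional('voltage'): dict,
    vol.Optional('current'): dict,
    vol.Optional('power'): dict,
}), has_at_least_one_key('voltage', 'current', 'power'))

CSE7766_LIKE_SCHEMA({'power': {'name': 'Power'}})   # ok
# CSE7766_LIKE_SCHEMA({})                           # raises vol.Invalid
```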
@@ -4,32 +4,27 @@ import esphomeyaml.config_validation as cv
from esphomeyaml.components import sensor
from esphomeyaml.components.dallas import DallasComponent
from esphomeyaml.const import CONF_ADDRESS, CONF_DALLAS_ID, CONF_INDEX, CONF_NAME, \
    CONF_RESOLUTION, CONF_UPDATE_INTERVAL
    CONF_RESOLUTION
from esphomeyaml.helpers import HexIntLiteral, get_variable

PLATFORM_SCHEMA = vol.All(sensor.PLATFORM_SCHEMA.extend({
PLATFORM_SCHEMA = cv.nameable(sensor.SENSOR_PLATFORM_SCHEMA.extend({
    vol.Exclusive(CONF_ADDRESS, 'dallas'): cv.hex_int,
    vol.Exclusive(CONF_INDEX, 'dallas'): cv.positive_int,
    cv.GenerateID(CONF_DALLAS_ID): cv.use_variable_id(DallasComponent),
    vol.Optional(CONF_RESOLUTION): vol.All(vol.Coerce(int), vol.Range(min=8, max=12)),
}).extend(sensor.SENSOR_SCHEMA.schema), cv.has_at_least_one_key(CONF_ADDRESS, CONF_INDEX))
    vol.Optional(CONF_RESOLUTION): vol.All(vol.Coerce(int), vol.Range(min=9, max=12)),
}), cv.has_at_least_one_key(CONF_ADDRESS, CONF_INDEX))


def to_code(config):
    hub = None
    for hub in get_variable(config[CONF_DALLAS_ID]):
        yield
    update_interval = config.get(CONF_UPDATE_INTERVAL)
    if CONF_RESOLUTION in config and update_interval is None:
        update_interval = 10000

    if CONF_ADDRESS in config:
        address = HexIntLiteral(config[CONF_ADDRESS])
        rhs = hub.Pget_sensor_by_address(config[CONF_NAME], address, update_interval,
                                         config.get(CONF_RESOLUTION))
        rhs = hub.Pget_sensor_by_address(config[CONF_NAME], address, config.get(CONF_RESOLUTION))
    else:
        rhs = hub.Pget_sensor_by_index(config[CONF_NAME], config[CONF_INDEX],
                                       update_interval, config.get(CONF_RESOLUTION))
                                       config.get(CONF_RESOLUTION))
    sensor.register_sensor(rhs, config)


@@ -5,7 +5,7 @@ from esphomeyaml.components import sensor
from esphomeyaml.const import CONF_HUMIDITY, CONF_MAKE_ID, CONF_MODEL, CONF_NAME, CONF_PIN, \
    CONF_TEMPERATURE, CONF_UPDATE_INTERVAL
from esphomeyaml.helpers import App, Application, add, gpio_output_pin_expression, variable
from esphomeyaml.pins import GPIO_OUTPUT_PIN_SCHEMA
from esphomeyaml.pins import gpio_output_pin_schema

DHT_MODELS = {
    'AUTO_DETECT': sensor.sensor_ns.DHT_MODEL_AUTO_DETECT,
@@ -19,11 +19,11 @@ MakeDHTSensor = Application.MakeDHTSensor

PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeDHTSensor),
    vol.Required(CONF_PIN): GPIO_OUTPUT_PIN_SCHEMA,
    vol.Required(CONF_TEMPERATURE): sensor.SENSOR_SCHEMA,
    vol.Required(CONF_HUMIDITY): sensor.SENSOR_SCHEMA,
    vol.Required(CONF_PIN): gpio_output_pin_schema,
    vol.Required(CONF_TEMPERATURE): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Required(CONF_HUMIDITY): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Optional(CONF_MODEL): vol.All(vol.Upper, cv.one_of(*DHT_MODELS)),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
})


@@ -12,9 +12,9 @@ MakeDHT12Sensor = Application.MakeDHT12Sensor

PLATFORM_SCHEMA = sensor.PLATFORM_SCHEMA.extend({
    cv.GenerateID(CONF_MAKE_ID): cv.declare_variable_id(MakeDHT12Sensor),
    vol.Required(CONF_TEMPERATURE): sensor.SENSOR_SCHEMA,
    vol.Required(CONF_HUMIDITY): sensor.SENSOR_SCHEMA,
    vol.Optional(CONF_UPDATE_INTERVAL): cv.positive_time_period_milliseconds,
    vol.Required(CONF_TEMPERATURE): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Required(CONF_HUMIDITY): cv.nameable(sensor.SENSOR_SCHEMA),
    vol.Optional(CONF_UPDATE_INTERVAL): cv.update_interval,
})


@@ -24,9 +24,9 @@ def to_code(config):
                                 config.get(CONF_UPDATE_INTERVAL))
    dht = variable(config[CONF_MAKE_ID], rhs)

    sensor.setup_sensor(dht.Pdht.Pget_temperature_sensor(), dht.Pmqtt_temperature,
    sensor.setup_sensor(dht.Pdht12.Pget_temperature_sensor(), dht.Pmqtt_temperature,
                        config[CONF_TEMPERATURE])
    sensor.setup_sensor(dht.Pdht.Pget_humidity_sensor(), dht.Pmqtt_humidity,
    sensor.setup_sensor(dht.Pdht12.Pget_humidity_sensor(), dht.Pmqtt_humidity,
                        config[CONF_HUMIDITY])


Some files were not shown because too many files have changed in this diff.