a robust room presence solution for home automation with nearly no false negatives

Overview

Argos Room Presence


This project builds a room presence solution on top of Argos. Using just a cheap Raspberry Pi Zero W (plus an attached Pi camera, or an RTMP-enabled camera) in any room, it provides a reliable, robust and efficient way to detect whether people are present in that room, exposing the result as an MQTT sensor in HomeAssistant that you can then use to automate lighting, media players, heating or whatever else in that room.

Preface

For a long time, I've been trying to figure out a robust room presence solution for my home automation setup within HomeAssistant. I've tried everything from simple battery-operated WiFi-connected PIR motion sensors to Bluetooth LE based detection using a cluster of Raspberry Pi Zeros running room-assistant. It all works half of the time and then crumbles like a house of cards the rest of the time. After a lot of tinkering and tuning, I have now devised a solution which, I can confidently proclaim, is practically free of false positives and false negatives, and works reliably in the real world. And trust me, when it comes to room presence, what matters is false negatives! Folks absolutely hate you when your fancy home automation turns off the TV and all the lights while everyone is tuned into that game/movie!

So, what's the secret sauce?

argos-presence provides reliable room presence using computer vision. Forget the thermal imaging sensors, PIR motion sensors and Bluetooth scanning solutions. Raspberry Pis can now run sophisticated computer vision and even machine learning algorithms (thank you, TensorFlow Lite!), thanks to performance advances both in single board computers and in OpenCV and TensorFlow.

Here's how argos-presence works:

(architecture diagram: arch-argos-presence)

The executive summary is the following:

  • argos-presence detects movement (even the tiniest finger or facial movement), and then makes sure the movement is actually a person when it matters.
  • We don't simply set presenceStatus based on motion. We have warmUp and coolDown periods.
  • When motion tries to switch presenceStatus from on to off, we have a coolDown period during which the argos object detection service is called to figure out if there's a person in the scene, and we keep extending the coolDown for as long as a person is detected. This avoids false negatives.
  • When motion tries to switch presenceStatus from off to on, we have a warmUp period where, again, we detect whether a person is present. This avoids false positives. For example, if your presenceStatus recently went from on to off, your lights are in the process of turning off, which the detector can see as motion; without a warmUp period, your room would keep flipping the lights on and off continuously. Note: this doesn't mean you have to wait warmUp seconds (e.g. 30) for your lights to turn on. During warmUp, the warmUp is terminated and presenceStatus switches to ON immediately if (and only if) a person is detected. The warmUp period is only in effect right after a recent change from presence to no presence (from the last motion). At all other times, whenever you come into the room, argos goes into presence status immediately (on motion alone). You can even turn off warmUp mode by setting warmUp seconds to 0.
  • The warmUp and coolDown periods need to be configured to suit your environment (sensible, tried and tested defaults are already set in the example config).
  • In effect, your lights (or whatever you choose to turn on in your automation) will turn ON instantly, but they'll take coolDown seconds (e.g. 5 minutes) to turn off and will ONLY turn off when there's no computer-vision-detected human in the frame (so no false negatives - a big principle on which the project was built). A simplified sketch of this logic follows this list.
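
To make the flow above concrete, here is a minimal sketch of the warmUp/coolDown presence logic in Python. This is not the actual argos-presence implementation; the class, method and constant names are illustrative placeholders (presence_warmup_secs and presence_cooldown_secs in the example config play the role of the two constants).

import time

WARMUP_SECS = 30      # plays the role of presence_warmup_secs
COOLDOWN_SECS = 300   # plays the role of presence_cooldown_secs

class PresenceStateMachine:
    def __init__(self):
        self.presence = False     # current presenceStatus
        self.last_motion = 0.0    # time of the last detected motion
        self.last_off = 0.0       # time presence last switched OFF

    def update(self, motion, person, now=None):
        # motion: boolean from the motion detector for this frame
        # person: boolean from the (local or remote) argos person detector
        now = time.time() if now is None else now
        if motion:
            self.last_motion = now

        if not self.presence:
            recently_off = (now - self.last_off) < WARMUP_SECS
            if person:
                # person confirmed during warmUp: switch ON immediately
                self.presence = True
            elif motion and not recently_off:
                # outside the warmUp window, motion alone is enough
                self.presence = True
        else:
            if person:
                # person seen during coolDown: keep extending it
                self.last_motion = now
            elif (now - self.last_motion) > COOLDOWN_SECS:
                # coolDown expired with nobody in the frame: switch OFF
                self.presence = False
                self.last_off = now
        return self.presence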

Installation

Depending on the deployment model you choose, you may have to install argos on the same device as argos-presence or on a different device. Follow the instructions to install argos here. You still have to clone argos and make it available in the PYTHONPATH, as argos-presence uses its motion detection API. This is done for you in the systemd service file provided.

On your pi:

cd ~
git clone https://github.com/angadsingh/argos-presence
sudo apt-get install python3-pip
pip3 install --upgrade pip
sudo apt-get install python3-venv
python3 -m venv argos-presence-venv/
source argos-presence-venv/bin/activate
pip install wheel
pip install -r argos-presence/requirements.txt

install it as a systemd service:

sudo cp argos-presence/resources/systemd/argos_presence.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable argos_presence.service
sudo systemctl start argos_presence

see the logs

journalctl --unit argos_presence.service -f

As a docker container

You can use the following instructions to install argos-presence as a docker container.

Install docker (optional)

curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

Run argos-presence as a docker container

Note: only docker images for armv7 (Raspberry Pi 2/3/4) and armv6 (Raspberry Pi Zero/W) are provided. It should be trivial to support an amd64 Dockerfile as well (armv6/v7 were much harder to figure out); contributions welcome!

replace "armv7" with "armv6" in the below instructions for running on raspberry pi zero (/w)!

docker run --rm -p8000:8000 -v configs:/configs \
    -v /home/pi/motion_frames:/motion_frames angadsingh/argos-presence:armv7 \
    --config configs.your_config --camconfig configs.your_camconfig

make a systemd service to run it automatically. this service automatically downloads the latest docker image and runs it for you:

sudo wget https://raw.githubusercontent.com/angadsingh/argos-presence/main/resources/systemd/argos_presence_docker.service -P /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable argos_presence_docker.service
sudo systemctl start argos_presence_docker

see the logs

journalctl --unit argos_presence_docker.service -f

Usage

You can run Argos Room Presence in the following modes of deployment:

  • Standalone (not recommended): run Argos Room Presence (this project) on a device as cheap and little as a Raspberry Pi Zero W. It only uses the argos motion detection API for room presence and does not use the TensorFlow ML based person detection (object detection). When people in the room don't move for the configured period, this causes false negatives. False positives: none. False negatives: sometimes. Speed: fast.
  • Remote person detection (recommended): run Argos Room Presence (this project) on a $15 Pi Zero W (with an attached Pi camera, which can be had for as little as $8), configured to do person detection using a remote Argos object detection service running on at least a Raspberry Pi 4, an NVIDIA Jetson Nano, or an old laptop or Mac mini (it needs to be able to run TensorFlow Lite at decent FPS). False positives: none. False negatives: none. Speed: faster.
  • Local person detection (ideal): run both the Argos object detection service and room presence (this project) on the same device. This avoids a network hop. Use this mode if you don't intend to use a Pi camera and can do presence detection from an existing RTMP camera in a room. False positives: none. False negatives: none. Speed: fastest.
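
For remote or local person detection, argos-presence needs to know where the argos object detection service lives, along with a few detection and timing parameters. Below is a minimal, illustrative config sketch in Python; the attribute names are modeled on the parameters exposed by the /config API (and on configs/config_example.py), but treat them as assumptions and check the example config for the authoritative names.

class Config:
    def __init__(self):
        # person detection: point at a remote argos object detection service
        # (remote mode) or at localhost (local mode); disable it entirely for
        # standalone mode. URL and attribute names are illustrative.
        self.argos_person_detection_enabled = True
        self.argos_service_api_url = "http://<argos-host>:8080/detect"
        self.argos_detection_threshold = 0.5         # minimum person confidence
        self.argos_detection_frequency_frames = 20   # call argos every N frames

        # presence state machine timings (seconds)
        self.presence_warmup_secs = 30
        self.presence_cooldown_secs = 300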

You can run room presence like this (although it's best to run it as a systemd service as described in the installation section):

PYTHONPATH=$PYTHONPATH:/home/pi/argos presence.py --ip 0.0.0.0 --port 8000 --config config --camconfig camconfig

Just like argos, argos-presence also exposes:

  • a flask server which serves a web page where you can see the motion and person detection happening in action
  • an image and video feed for you to use as a camera in HA (more on that below)
  • APIs to get current status, set config and even set PiCamera configuration dynamically
  • Browse /: shows a web page with the real-time processing of the video stream (serves /video_feed)
  • GET /status: shows the current load and motion status
  • GET /config: shows the config
  • GET /config?<param>=<value>: lets you edit any config parameter without restarting the service
  • GET /camconfig: shows the PiCamera config
  • GET /camconfig?<param>=<value>: lets you edit any PiCamera config parameter without restarting the service
  • GET /image: returns the latest frame as a JPEG image (useful in the HA generic camera platform)
  • GET /video_feed: streams an MJPEG video stream of the motion and person detector (useful in the HA generic camera platform)
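
As a quick sanity check that the service is up, you can hit these endpoints from any HTTP client. Here is a small Python sketch using only the standard library; replace <argos-presence-host> with your device's address (the config parameter name used in the last call is the same one used in the REST command examples further down).

from urllib.request import urlopen

host = "http://<argos-presence-host>:8000"

# current load and motion status
print(urlopen(f"{host}/status").read().decode())

# dump the current config
print(urlopen(f"{host}/config").read().decode())

# change a config parameter without restarting the service,
# e.g. the presence coolDown period
urlopen(f"{host}/config?presence_cooldown_secs=300")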

Home Assistant Integration

Once argos-presence is up and running, streaming your PiCamera or RTMP camera feed, doing motion detection, doing local or remote object detection by calling the argos service or API, and sending presence state to HA via MQTT, you can create an MQTT sensor and an automation in HA to act on that presence state.

Create the MQTT sensor:

sensor:
  - platform: mqtt
    name: "Argos Room Presence Sensor Living Room"
    state_topic: "home-assistant/argos-room-presence/living_room"
    unit_of_measurement: "presence"
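
The state_topic above has to match the topic argos-presence publishes presence state to, which you set in your argos-presence config module. A hedged sketch of the relevant settings is shown below; the attribute names are illustrative assumptions (check configs/config_example.py for the actual ones).

# illustrative MQTT settings inside your argos-presence config module;
# attribute names are assumptions, not copied verbatim from config_example.py
self.send_mqtt = True
self.mqtt_host = "<your-mqtt-broker>"
self.mqtt_port = 1883
self.mqtt_username = "<user>"
self.mqtt_password = "<password>"
# must match state_topic in the HA sensor above
self.mqtt_state_topic = "home-assistant/argos-room-presence/living_room"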

a sample automation which turns on/off a scene in a room based on presence:

alias: Living Room Presence Based Scene Activation
description: ''
trigger:
  - platform: state
    entity_id: sensor.argos_room_presence_sensor_living_room
    to: '1'
    from: '0'
  - platform: state
    entity_id: sensor.argos_room_presence_sensor_living_room
    from: '1'
    to: '0'
condition: []
action:
  - choose:
      - conditions:
          - condition: state
            entity_id: sensor.argos_room_presence_sensor_living_room
            state: '1'
        sequence:
          - scene: scene.living_room_on_scene
      - conditions:
          - condition: state
            entity_id: sensor.argos_room_presence_sensor_living_room
            state: '0'
        sequence:
          - scene: scene.living_room_off_scene
    default: []
mode: restart

one thing we need to take care of is accommodating lighting changes during the day. argos-presence exposes a /camconfig API which lets you change any PiCamera property dynamically. We'll use this API to change the camera settings for day and night lighting. This is necessary, as otherwise argos won't be able to detect motion when the lights are off at night, for example. The automation below changes the exposure settings of the PiCamera at night so that it can see in low light as well! (you'll be surprised how well that $8 camera sees in low light with the right settings!)

alias: Apply Argos Presence Dynamic Camera Settings
description: ''
trigger:
  - platform: time
    at: '17:30'
  - platform: time
    at: '06:30'
  - platform: homeassistant
    event: start
condition: []
action:
  - choose:
      - conditions:
          - condition: time
            after: '17:30'
            before: '06:30'
        sequence:
          - service: rest_command.argos_living_room_sensor_set_camconfig
            data:
              exposure_mode: 'off'
              framerate: 2
              iso: 0
              image_denoise: ''
              video_denoise: ''
              video_stabilization: ''
              shutter_speed: 499829
              meter_mode: spot
              exposure_compensation: 0
              awb_mode: 'off'
              awb_gains_red: 1.5625
              awb_gains_blue: 1.20703125
      - conditions:
          - condition: time
            after: '06:30'
            before: '17:30'
        sequence:
          - service: rest_command.argos_living_room_sensor_set_camconfig
            data:
              exposure_mode: auto
              framerate: 5
              iso: 0
              image_denoise: 1
              video_denoise: 1
              video_stabilization: ''
              shutter_speed: 0
              meter_mode: average
              exposure_compensation: 0
              awb_mode: auto
              awb_gains_red: 1
              awb_gains_blue: 1
    default: []
mode: single

for the above to work, we need to create some REST commands:

rest_command:
  argos_living_room_sensor_reset_bg:
    url: 'http://<argos-presence-host>:8000/config?reset_bg_model=True'
    timeout: 120
  argos_living_room_sensor_set_config:
    url: "http://<argos-presence-host>:8000/config?bg_accum_weight={{ states('input_text.argos_presence_sensor_background_accumulation_weight') }}&min_cont_area={{ states('input_text.argos_presence_sensor_minimum_contour_area') }}&tval={{ states('input_text.argos_presence_sensor_tval') }}&video_feed_fps={{ states('input_text.argos_presence_sensor_video_feed_fps') }}&presence_cooldown_secs={{ states('input_text.argos_presence_sensor_presence_cooldown_secs') }}&presence_warmup_secs={{ states('input_text.argos_presence_sensor_presence_warmup_secs') }}&argos_person_detection_enabled={{ states('input_text.argos_presence_sensor_person_detection_enabled') }}&argos_detection_threshold={{ states('input_text.argos_presence_sensor_person_detection_threshold') }}&argos_detection_frequency_frames={{ states('input_text.argos_presence_sensor_person_detection_frequency_frames') }}"
    timeout: 120
  argos_living_room_sensor_set_camconfig:
    url: "http://<argos-presence-host>:8000/camconfig?exposure_mode={{exposure_mode}}&framerate={{framerate}}&iso={{iso}}&image_denoise={{image_denoise}}&video_denoise={{video_denoise}}&video_stabilization={{video_stabilization}}&shutter_speed={{shutter_speed}}&meter_mode={{meter_mode}}&exposure_compensation={{exposure_compensation}}&awb_mode={{awb_mode}}&awb_gains_red={{awb_gains_red}}&awb_gains_blue={{awb_gains_blue}}"
    timeout: 120

you can create a lovelace tab for managing argos configuration from HA itself. first create some input helpers to take in text input for the properties you want to be able to change from the UI and then use the following lovelace card. this uses the REST command created above for the /config API (not /camconfig):

type: vertical-stack
cards:
  - type: entities
    entities:
      - entity: input_text.argos_presence_sensor_background_accumulation_weight
        name: Background Accumulation Weight
      - entity: input_text.argos_presence_sensor_tval
        name: Threshold tval
      - entity: input_text.argos_presence_sensor_minimum_contour_area
        name: Minimum Contour Area
      - entity: input_text.argos_presence_sensor_video_feed_fps
        name: Video Feed FPS
      - entity: input_text.argos_presence_sensor_presence_cooldown_secs
        name: Presence Cool Down Seconds
      - entity: input_text.argos_presence_sensor_presence_warmup_secs
        name: Presence Warm Up Seconds
  - type: button
    tap_action:
      action: call-service
      service: rest_command.argos_living_room_sensor_set_config
    name: Set config

it will look like this:

lastly, you can add the argos-presence video feed as a camera in home assistant! doing this, you can then use a lovelace picture glance card to see your presence detector (doing motion detection and tensorflow based person detection) in action, and even record its footage using motionEye, etc.

camera:
  - platform: mjpeg
    name: "Argos Living Room Presence Cam"
    still_image_url: "http://<argos-presence-host>:8000/image"
    mjpeg_url: "http://<argos-presence-host>:8000/video_feed"

add it to Lovelace:

aspect_ratio: 0%
camera_image: camera.argos_living_room_cam
entities:
  - entity: camera.argos_living_room_cam
title: Living Room Presence Cam
type: picture-glance

here's how my presence tab looks in lovelace:

(screenshot: argos-presence tab in Home Assistant)

Privacy

If you have privacy concerns about your presence camera video/image feed entering the network (and an attacker potentially getting access to it), then you can set the following config setting to completely disable the output frame:

self.output_frame_enabled = False

You may also want to install the argos object detector locally.

