Simple Summary

In order to keep dairy cows under satisfactory health and welfare conditions, it is very important to monitor the animals in their living environment. With the support of technology, and, in particular, with the installation of sensors on neck-collars, cow behavior can be adequately monitored, and different behavioral patterns can be classified. In this study, an open and customizable device was developed to classify the behaviors of dairy cows. The device communicates with a mobile application via Bluetooth to acquire raw data from behavioral observations and via an ad hoc radio channel to send the data from the device to the gateway. After observing 32 cows on 3 farms for a total of 108 h, several machine learning algorithms were trained to classify their behaviors. The decision tree algorithm was found to be the best compromise between complexity and accuracy to classify standing, lying, eating, and ruminating. The open nature of the system enables the addition of other functions (e.g., localization) and the integration with other information sources, e.g., climatic sensors, to provide a more complete picture of cow health and welfare in the barn.

Abstract

Monitoring dairy cattle behavior can improve the detection of health and welfare issues for early interventions. Commercial sensors often do not provide researchers with sufficient raw and open data; therefore, the aim of this study was to develop an open and customizable system to classify cattle behaviors. A 3D accelerometer device and host-board (i.e., sensor node) were embedded in a case and fixed on a dairy cow collar. It was developed to work in two modes: (1) acquisition mode, where a mobile application supported the raw data collection during observations; and (2) operating mode, where data were processed and sent to a gateway and on to the cloud. Accelerations were sampled at 25 Hz and behaviors were classified in 10-min windows. Several algorithms were trained with the 108 h of behavioral data acquired from 32 cows on 3 farms, and after evaluating their computational/memory complexity and accuracy, the decision tree algorithm was selected. This model detected standing, lying, eating, and ruminating with an average accuracy of 85.12%. The open nature of this system enables the addition of other functions (e.g., real-time localization of cows) and the integration with other information sources, e.g., microenvironment and air quality sensors, thereby enhancing data processing potential.
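To make the classification pipeline described above concrete, the following is a minimal sketch of how 25 Hz tri-axial accelerometer data could be segmented into 10-min windows and classified with a decision tree. The feature set (per-axis mean and standard deviation plus mean magnitude), the model parameters, and the synthetic data are illustrative assumptions, not the authors' exact implementation.

```python
# Illustrative sketch (not the authors' exact pipeline): segment 25 Hz
# tri-axial accelerometer data into 10-min windows, compute simple summary
# features per window, and classify behaviors with a decision tree.
# Feature choices and training data here are assumptions for demonstration.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

FS = 25                              # sampling rate (Hz), as stated in the abstract
WINDOW_S = 10 * 60                   # 10-minute classification window
SAMPLES_PER_WINDOW = FS * WINDOW_S   # 15,000 samples per window

BEHAVIORS = ["standing", "lying", "eating", "ruminating"]

def window_features(acc_xyz: np.ndarray) -> np.ndarray:
    """Collapse one (SAMPLES_PER_WINDOW, 3) window into a feature vector:
    per-axis mean and standard deviation, plus mean magnitude (7 features)."""
    means = acc_xyz.mean(axis=0)
    stds = acc_xyz.std(axis=0)
    magnitude = np.linalg.norm(acc_xyz, axis=1).mean()
    return np.concatenate([means, stds, [magnitude]])

def segment(acc_stream: np.ndarray) -> np.ndarray:
    """Split a continuous (N, 3) acceleration stream into whole 10-min windows
    and return one feature vector per window."""
    n_windows = len(acc_stream) // SAMPLES_PER_WINDOW
    windows = acc_stream[: n_windows * SAMPLES_PER_WINDOW].reshape(
        n_windows, SAMPLES_PER_WINDOW, 3
    )
    return np.stack([window_features(w) for w in windows])

# Synthetic stand-in for the labelled observation data (108 h in the study).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 7))            # 200 labelled windows (synthetic)
y_train = rng.choice(BEHAVIORS, size=200)      # synthetic behavior labels

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
clf.fit(X_train, y_train)

# Classify a new (unlabelled) hour of collar data, window by window.
new_stream = rng.normal(size=(FS * 3600, 3))   # one hour of synthetic 25 Hz data
predictions = clf.predict(segment(new_stream))
print(predictions)                              # e.g. ['lying' 'eating' ...]
```

A decision tree of modest depth keeps the memory footprint and per-window inference cost low, which is consistent with the abstract's rationale for choosing it as the best compromise between complexity and accuracy on a resource-constrained sensor node.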