Published in last 50 years
Articles published on Eating Occasions
- Research Article
- 10.1038/s41598-025-09427-8
- Jul 4, 2025
- Scientific Reports
- Azadeh Lesani + 1 more
Late energy intake (EI) is linked to increased obesity; however, the relationship between circadian eating patterns—including the timing (morning vs. evening) of energy and macronutrient intake, eating frequency, and eating window duration—and metabolic syndrome (MetS) in Iranian women remains insufficiently elucidated, particularly across age groups, menopausal statuses, and diurnal preferences. In this cross-sectional study, the dietary intake of 574 women aged 20 to 60 years from Tehran was assessed using three 24-hour dietary recalls. Diurnal preference was evaluated through the Morningness-Eveningness Questionnaire. The analysis focused on eveningness of EI and macronutrient intake (%evening - %morning), the number of eating occasions (EOs), and eating window duration. Anthropometric measurements, blood pressure, glucose, and lipid levels were recorded. Generalized linear regression was used. Eveningness of EI was related to increased MetS risk, tertile (T) 3 vs. T1 (OR (95% CI): 0.35 (0.11–0.62), p = 0.03). Also, the number of EOs, T3 vs. T1 (-0.68 (-1.32 – -0.23), p = 0.02), was related to decreased MetS. Eveningness of EI was linked to risk of elevated fasting blood glucose, T3 vs. T1 (0.46 (0.09–0.91), p = 0.02). Additionally, T3 vs. T1 of eveningness of protein intake showed a significant decrease in TG (-0.56 (-1.01 – -0.12); p = 0.01). No associations were found in analyses stratified by age, menopausal status, or chronotype. Consuming fewer meals, along with higher evening energy intake, likely from non-protein sources, may be associated with an increased risk of MetS cross-sectionally, emphasizing the need for longitudinal studies to deepen our understanding of these relationships.
- Research Article
- 10.1016/j.tjnut.2025.05.014
- Jul 1, 2025
- The Journal of nutrition
- Samuel Scott + 7 more
What Adults in Rural South Asia Eat and When They Eat It: Evidence From Bangladesh, India, and Nepal.
- Research Article
- 10.3390/nu17111806
- May 26, 2025
- Nutrients
- Bi Xue Patricia Soh + 4 more
Background/Objectives: Inadequate intake of indispensable amino acids (IAAs) is a significant challenge in vegan diets. Since IAAs are not produced or stored over long durations in the human body, regular and balanced dietary protein consumption throughout the day is essential for metabolic function. The objective of this study is to investigate the variation in protein and IAA intake across 24 h among New Zealand vegans with time-series clustering, using Dynamic Time Warping (DTW). Methods: This data-driven approach objectively categorised vegan dietary data into distinct clusters for protein intake and protein quality analysis. Results: Total protein consumed per eating occasion (EO) was 11.1 g, with 93.5% of the cohort falling below the minimal threshold of 20 g of protein per EO. The mean protein intake for each EO in cluster 1 was 6.5 g, cluster 2 was 11.4 g and only cluster 3 was near the threshold at 19.0 g. IAA intake was highest in cluster 3, with lysine and leucine being 3× higher in cluster 3 than cluster 1. All EOs in cluster 1 were below the reference protein intake relative to body weight, closely followed by cluster 2 (91.5%), while cluster 3 comparatively had the lowest EOs under this reference (31.9%). Conclusions: DTW produced three distinct dietary patterns in the vegan cohort. Further exploration of plant protein combinations could inform recommendations to optimise protein quality in vegan diets.
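The clustering described above rests on the Dynamic Time Warping distance between 24-hour intake profiles. A minimal pure-Python sketch of that distance (the classic dynamic-programming formulation) follows; the hourly protein series are invented for illustration and are not study data.

```python
# Sketch of Dynamic Time Warping (DTW), the distance the study used to
# cluster 24-hour protein-intake profiles. Illustrative only; the series
# below are hypothetical hourly protein intakes (g), not study data.

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]

# Two eaters with the same three protein peaks, shifted by one hour:
early = [0, 20, 0, 15, 0, 25, 0]
late  = [0, 0, 20, 0, 15, 0, 25, 0]
print(dtw_distance(early, late))         # 0.0: DTW absorbs the time shift
print(dtw_distance(early, [0] * 7))      # 60.0: profiles genuinely differ
```

Unlike a pointwise (Euclidean) comparison, DTW matches peaks that occur at slightly different clock times, which is why it suits eating-occasion profiles.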
- Research Article
- 10.1017/s0029665125000333
- Apr 1, 2025
- Proceedings of the Nutrition Society
- H.R.B Arini + 3 more
Previous studies have shown the health benefits of daily total protein intake(1), yet temporal protein patterns in the population have rarely been investigated. The currently available studies have examined the associations between total protein intake at eating occasions (EOs) with cardiometabolic(2) and muscular health(3) but have not accounted for different protein sources. This study aimed to describe temporal patterns of total, plant, and animal protein intake at EOs in Australian adults, and to examine these patterns according to their sociodemographic and eating pattern characteristics (e.g., meal and snack frequencies, amount of protein intake). Using the 2011–12 Australian National Nutrition and Physical Activity Survey data, this study included adults aged ≥ 19 years who completed one 24-hour dietary recall (n = 6741). Total, animal and plant protein intake at self-reported EOs was estimated using the AUSNUT 2011–13 nutrient database and Australian Dietary Guidelines (ADG) food classification system(4). Plant protein included grains, nuts, and other plant-based, protein-containing foods, while animal protein consisted of meats, dairy, and other animal-source foods. Separate latent variable mixture models were used to identify temporal patterns of total, animal, and plant protein based on hourly intakes of total, animal, and plant protein, respectively. Pearson’s Chi-square test (for categorical variables) and one-way analysis of variance (for continuous variables) were used to examine the differences in participant characteristics between latent classes of temporal protein patterns. Three latent classes for men’s and women’s intake of total, animal, and plant proteins were identified. Class 1 was characterised by high probabilities of consuming protein at the usual Australian mealtime (e.g., dinner at 18:00–19:00h), and participants in this class were significantly older than the other two classes (all, p < 0.001). 
Class 2 had a high probability of eating protein an hour later than the mealtime of Class 1 and the highest protein intake from meals (all, p < 0.001), except for men’s total protein and women’s plant protein. Participants in Class 2 of total (all, p < 0.001), animal (all, p < 0.001), and plant protein (women only, p = 0.02) were characterised by high income and employment status. Participants in Class 3 had the lowest meal frequency (all, p < 0.001) and the lowest total, animal, and plant protein intakes from meals (all, p < 0.001), but the highest intakes from snacks (p < 0.001), except for women’s animal protein intake. Most adults in Class 3 of total (men only, p < 0.001) and animal protein (all, p < 0.001) also had high education level, lived in urban areas, and were not married. Three temporal protein patterns with distinct characteristics were identified in this study. Future studies need to investigate whether these temporal protein intake patterns are associated with health outcomes.
- Research Article
- 10.1016/j.ajcnut.2025.01.012
- Mar 1, 2025
- The American journal of clinical nutrition
- Francisca Ibacache + 3 more
Investigating eating architecture and the impact of the precision of recorded eating time: a cross-sectional study.
- Research Article
- 10.3390/foods14020276
- Jan 16, 2025
- Foods (Basel, Switzerland)
- Ileana Baldi + 5 more
Wearable devices equipped with a range of sensors have emerged as promising tools for monitoring and improving individuals' health and lifestyle. This work contributes to the development of effective and reliable methods for dietary monitoring based on raw kinetic data generated by wearable devices, using resources from the NOTION study. A total of 20 healthy subjects (9 women and 11 men, aged 20-31 years) were equipped with two commercial smartwatches during four eating occasions under semi-naturalistic conditions. All meals were video-recorded, and acceleration data were extracted and analyzed. Features were derived from the acceleration data, and food recognition on these features was performed using random forest (RF) models with 5-fold cross-validation. Classifier performance was expressed as out-of-bag sensitivity and specificity. Acceleration along the x-axis and power showed the highest and lowest median variable importance, respectively. Increasing the window size from 1 to 5 s led to a performance gain for almost all food items. The RF classifier reached the highest performance in identifying meatballs (89.4% sensitivity and 81.6% specificity) and the lowest in identifying sandwiches (74.6% sensitivity and 72.5% specificity). Monitoring food items using simple wristband-mounted wearable devices is feasible and accurate for some foods but unsatisfactory for others. Machine learning tools are necessary to deal with the complexity of the signals gathered by the devices, and research is ongoing to further improve accuracy and to work on large-scale, real-time implementation and testing.
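The pipeline above classifies food items from features computed over fixed-size windows of raw acceleration. A minimal sketch of that windowing step follows, with an invented 50 Hz single-axis trace; the study's actual sampling rate and feature set are not given in the abstract.

```python
# Sketch of the windowed feature extraction implied by the study: raw wrist
# acceleration is cut into fixed-length windows and summarised into simple
# features (mean, variance, signal power) before classification.
# The sampling rate and signal below are hypothetical.

import math

def window_features(signal, fs, window_s):
    """Split `signal` (one acceleration axis, `fs` Hz) into non-overlapping
    windows of `window_s` seconds and compute per-window features."""
    size = int(fs * window_s)
    feats = []
    for start in range(0, len(signal) - size + 1, size):
        w = signal[start:start + size]
        mean = sum(w) / size
        var = sum((x - mean) ** 2 for x in w) / size
        power = sum(x * x for x in w) / size   # mean squared amplitude
        feats.append({"mean": mean, "var": var, "power": power})
    return feats

# 10 s of a fake 50 Hz x-axis trace: quiet, then a "bite-like" oscillation.
fs = 50
quiet = [0.0] * (5 * fs)
moving = [math.sin(2 * math.pi * 2 * t / fs) for t in range(5 * fs)]
feats_1s = window_features(quiet + moving, fs, 1)   # ten 1 s windows
feats_5s = window_features(quiet + moving, fs, 5)   # two 5 s windows
print(len(feats_1s), len(feats_5s))                 # 10 2
```

Larger windows (as in the study's 1 s vs. 5 s comparison) trade temporal resolution for more stable feature estimates, which is one plausible reason performance improved with window size.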
- Research Article
- 10.1017/s0007114524002745
- Nov 20, 2024
- The British journal of nutrition
- Azadeh Lesani + 5 more
Chrono-nutrition is an emerging field that examines how the frequency and timing of meals impact health. Previous research shows inconsistency in the relationship between chrono-nutritional components and cardiometabolic health. We investigated cross-sectional associations between these components and cardiometabolic health in 825 Iranian adults aged 20-59 years. Dietary data, including the number of eating occasions, meal timing and meal irregularity of energy intake, were collected using three 24-h dietary recalls. Anthropometric measurements, blood pressure and laboratory tests (fasting plasma glucose, lipid profile, insulin, uric acid and C-reactive protein) were conducted. Insulin resistance and sensitivity (homeostatic model assessment for insulin resistance, homeostatic model assessment for insulin sensitivity), the TAG-glucose index, the lipid accumulation product and BMI were calculated. Demographic and morningness-eveningness questionnaires were completed. General linear regression was used to assess associations between chrono-nutritional components and outcomes. Interactions with age and BMI were examined in all associations. Chrono-nutrition components were not significantly related to cardiometabolic risk factors in the total population. However, a lower number of eating occasions was associated with an increased LDL-cholesterol:HDL-cholesterol ratio (β (95 % CI): 0·26 (0·06, 0·48)) among overweight and obese participants. Additionally, less irregularity in breakfast energy intake was associated with a lower total cholesterol:HDL-cholesterol ratio (-0·37 (-0·95, -0·18)) and a lower LDL-cholesterol:HDL-cholesterol ratio (-0·32 (-0·79, -0·13)) among participants with a normal BMI (all P< 0·05). The study concluded that more frequent meals and regular energy intake might enhance cardiometabolic health cross-sectionally, highlighting the need for prospective studies to further investigate these associations and the mediating role of BMI.
- Research Article
- 10.1017/s0029665124005767
- Nov 1, 2024
- Proceedings of the Nutrition Society
- N.R Tran + 3 more
Poor diet quality among young adults contributes to increased rates of overweight and obesity(1). Improving diet quality requires small and achievable changes in eating behaviours(2). Personalised nutrition interventions offer a promising strategy to modify behaviour and subsequently enhance diet quality, but they require input data on individuals' past behaviour and their environmental contexts to ensure advice is relevant and effective(3). Machine learning (ML) is a useful tool for predicting behaviours, but few studies have explored the integration of ML capabilities into precision nutrition applications(4–6). Therefore, this study used ML to investigate whether contextual factors occurring at eating occasions (EO) predict food consumption and, consequently, overall daily diet quality. Analyses were conducted on cross-sectional data from the Measuring Eating in Everyday Life Study (MEALS)(7–8). Participants (aged 18-30 years, n = 675) recorded dietary intakes at EO (i.e. meals and snacks) in near-real time (3-4 non-consecutive days) using a smartphone food diary app. Contextual factors for each EO were recorded via the app and categorised as social-environmental factors (e.g. activity, persons present while eating) and physical-environmental factors (e.g. consumption location, purchase location). Person-level factors describing participant characteristics were collected during an online survey. Intakes (servings per EO) of vegetables, fruits, grains, meat, dairy, and discretionary foods were estimated as per the Australian Dietary Guidelines. Gradient boosted decision tree(9) and random forest models(10) were chosen a priori; decision trees provide explainable ML, while random forests improve accuracy(11). Their performance was evaluated using 10-fold cross-validation, comparing mean absolute error (MAE), root mean square error (RMSE), and R squared. Feature importance analysis was performed to identify the variables most important for predicting food consumption. All analysis was performed using R. Results indicate that ML can predict most food groups at EO using contextual factors, with an acceptable range of differences between actual and predicted consumption (<1 serving per EO). For instance, MAE values for fruits, dairy, and meat were 0.35, 0.34, and 0.56 servings, respectively; this suggests that, on average, the models' predictions are off by 0.35 servings of fruit per EO when using contextual factors (RMSE values for fruit, dairy, and meat were 0.61, 0.50, and 0.80 servings, respectively). Notably, when investigating the influence of different contextual factors on the models' predictions, feature importance analysis indicated that person-level factors such as self-efficacy and age were highly important, while persons present and purchase location ranked highly among eating occasion-level factors across most food groups. ML can offer valuable insights into the interplay between contextual factors and food consumption. Future research should investigate which contextual factors, when modified, lead to favourable dietary behaviours, and incorporate these findings into precision nutrition interventions.
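The MEALS analyses above judge predictions by mean absolute error and root mean square error. A minimal sketch of the two measures follows; the servings data are made up for illustration.

```python
# Sketch of MAE and RMSE, the two error measures used to evaluate the
# food-group prediction models. Values below are invented servings.

import math

def mae(actual, predicted):
    """Mean absolute error: average size of the prediction errors."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean square error: like MAE, but penalises large errors more."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

actual    = [1.0, 0.0, 2.0, 1.5]   # e.g. servings of fruit per eating occasion
predicted = [0.8, 0.3, 1.6, 1.8]
print(round(mae(actual, predicted), 3), round(rmse(actual, predicted), 3))
```

RMSE is always at least as large as MAE on the same data; a large gap between them (as for meat above, 0.80 vs. 0.56) signals occasional large misses rather than uniformly small errors.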
- Research Article
- 10.1186/s12889-024-20512-x
- Oct 29, 2024
- BMC Public Health
- Huan Zhang + 12 more
Background: Gallstones are strongly associated with eating occasion (EO) timing and energy distribution, but few studies have addressed this aspect. Therefore, we utilized data from the 2017-2018 National Health and Nutrition Examination Survey (NHANES) to explore the association between temporal eating patterns and energy distribution patterns and the incidence of gallstones. Methods: Our study comprised participants aged 20 or older who completed the NHANES dietary intake interview and self-reported health questionnaire. Gallstone status was self-reported ("Have you ever been told by a doctor?"). We identified temporal eating patterns using latent class analysis (LCA) and energy distribution patterns using latent profile analysis (LPA). The association between temporal eating patterns, energy distribution patterns, and gallstones was examined using logistic regression modeling. Results: The study included a total of 4,692 participants. LCA identified four temporal eating patterns, labeled “Conventional,” “Early breakfast,” “Later breakfast,” and “Grazing.” Compared to the “Conventional” pattern, the “Early breakfast” pattern (OR 0.809, 95%CI 0.808–0.811) was associated with a reduced risk of gallstones, while the “Later breakfast” (OR 1.435, 95%CI 1.432–1.438) and “Grazing” (OR 1.147, 95%CI 1.145–1.148) patterns were associated with an increased risk of gallstones. LPA identified four energy distribution patterns, labeled “Guideline,” “High carbohydrates,” “Carbs-fat balance,” and “High fat.” The “High carbohydrates” pattern (OR 1.329, 95%CI 1.326–1.331) was associated with an increased risk of gallstones compared to the “Guideline” pattern. The “Carbs-fat balance” pattern (OR 0.877, 95%CI 0.876–0.879) and the “High fat” pattern (OR 0.848, 95%CI 0.846–0.850) were significantly and negatively associated with the risk of gallstones. Conclusions: In summary, inappropriate eating timing and energy sources are associated with gallstones. As a dietary prevention measure for gallstones, we suggest adhering to a regular eating routine and avoiding overly casual and frequent food consumption. If the main EO of the day occurs in the morning, it should be no later than 9:00 a.m. Additionally, reducing carbohydrate intake and maintaining a moderate level of fat intake is believed to contribute to a lower risk of gallstones.
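The odds ratios in the gallstone abstract come from logistic regression. The sketch below shows how a model coefficient on the log-odds scale becomes an OR with a Wald-type 95% CI; the beta and standard error are hypothetical, chosen only to roughly reproduce the reported "Later breakfast" estimate.

```python
# Sketch of turning a logistic-regression coefficient into an odds ratio
# with a Wald 95% confidence interval. The beta and SE below are
# hypothetical, not taken from the study's model output.

import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a coefficient (log-odds scale) into an OR and its
    Wald-type confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Roughly reproduces the reported "Later breakfast" OR 1.435 (1.432-1.438):
or_, lo, hi = odds_ratio_ci(beta=0.361, se=0.001)
print(round(or_, 3), round(lo, 3), round(hi, 3))   # 1.435 1.432 1.438
```

The extremely narrow CIs in the abstract imply very small standard errors on the log-odds scale, as the hypothetical `se=0.001` illustrates.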
- Research Article
- 10.1093/eurpub/ckae144.296
- Oct 28, 2024
- European Journal of Public Health
- N R Tran + 3 more
Abstract Background Improving diet quality relies on making manageable adjustments to eating behaviours. Personalised nutrition interventions hold promise for modifying behaviour. Machine learning (ML) offers a novel approach to examining dietary behaviours in personalised nutrition by leveraging data on past behaviours and environmental contexts. This study aims to investigate whether contextual factors at eating occasions (EO) can predict food consumption to enhance diet quality. Methods Cross-sectional data from the Measuring Eating in Everyday Life Study (MEALS) were analysed (n = 675, 18-35y). A smartphone food diary app recorded dietary intakes at EO for 3-4 non-consecutive days, also capturing social-environmental (e.g., activity) and physical-environmental factors (e.g., consumption location). Participant characteristics were collected via an online survey. Food group intakes (servings per EO) followed the Australian Dietary Guidelines. This study benchmarked two established models, gradient boosted decision tree and random forest, which have previously shown high performance in similar tasks. Performance was evaluated using 10-fold cross-validation, measuring mean absolute error (MAE), root mean square error (RMSE), and R squared. Feature importance analysis identified key variables for predicting food consumption. Results ML predicted most food groups at EO using contextual factors, with slight differences between actual and predicted consumption (<1 serving per EO). For fruits, dairy, and meat, MAE values were 0.35, 0.34, and 0.56 servings, respectively (RMSE values: 0.61, 0.50, and 0.80 servings). Self-efficacy, age and consumption location were influential in most ML models. Conclusions ML offers insights into contextual factors and food consumption, suggesting directions for precision nutrition interventions. Future research should identify positive influences of contextual factors on dietary behaviours and incorporate these insights into interventions.
Key messages • Machine learning can effectively predict food consumption based on contextual factors at eating occasions. • Understanding the influence of contextual factors such as consumption location on food consumption can inform the development of precision nutrition interventions aimed at improving diet quality.
- Research Article
- 10.1186/s12966-024-01639-x
- Sep 12, 2024
- International Journal of Behavioral Nutrition and Physical Activity
- Luciana Pons-Muzzo + 8 more
Background: Altered meal timing patterns can disrupt the circadian system and affect metabolism. Our aim was to describe sex-specific chrono-nutritional patterns, assess their association with body mass index (BMI) and investigate the role of sleep in this relationship. Methods: We used the 2018 questionnaire data from the population-based Genomes for Life (GCAT) cohort (n = 7074) of adults aged 40–65 in Catalonia, Spain, for cross-sectional analysis and its follow-up questionnaire data in 2023 (n = 3128) for longitudinal analysis. We conducted multivariate linear regressions to explore the association between mutually adjusted meal-timing variables (time of first meal, number of eating occasions, nighttime fasting duration) and BMI, accounting for sleep duration and quality and additional relevant confounders, including adherence to a Mediterranean diet. Finally, cluster analysis was performed to identify chrono-nutritional patterns, separately for men and women, and sociodemographic and lifestyle characteristics were compared across clusters and analyzed for associations with BMI. Results: In the cross-sectional analysis, a later time of first meal (β per 1 h increase = 0.32, 95% CI 0.18, 0.47) and more eating occasions (only in women, β per 1 more eating occasion = 0.25, 95% CI 0.00, 0.51) were associated with a higher BMI, while a longer nighttime fasting duration was associated with a lower BMI (β per 1 h increase = -0.27, 95% CI -0.41, -0.13). These associations were particularly evident in premenopausal women. Longitudinal analyses corroborated the associations with time of first meal and nighttime fasting duration, particularly in men. Finally, we obtained 3 sex-specific clusters that mostly differed in number of eating occasions and time of first meal. Clusters defined by a late first meal displayed lower education and higher unemployment in men, as well as higher BMI for both sexes. A clear “breakfast skipping” pattern was identified only in the smallest cluster in men. Conclusions: In a population-based cohort of adults in Catalonia, we found that a later time of first meal was associated with a higher BMI, while a longer nighttime fasting duration was associated with a lower BMI, in both cross-sectional and longitudinal analyses.
- Research Article
- 10.3390/nu16091295
- Apr 26, 2024
- Nutrients
- Leinys S Santos-Báez + 10 more
This observational pilot study examined the association between diet, meal pattern and glucose over a 2-week period under free-living conditions in 26 adults with dysglycemia (D-GLYC) and 14 with normoglycemia (N-GLYC). We hypothesized that a prolonged eating window and late eating occasions (EOs), along with a higher dietary carbohydrate intake, would result in higher glucose levels and glucose variability (GV). General linear models were run with meal timing (captured via time-stamped photographs in real time), diet composition (from dietary recalls), and their variability (SD) as predictors, and glucose variables (mean glucose, mean amplitude of glucose excursions [MAGE], largest amplitude of glucose excursions [LAGE] and GV) as dependent variables. After adjusting for calories and nutrients, a later eating midpoint predicted a lower GV (β = -2.3, SE = 1.0, p = 0.03) in D-GLYC, while a later last EO predicted a higher GV (β = 1.5, SE = 0.6, p = 0.04) in N-GLYC. A higher carbohydrate intake predicted a higher MAGE (β = 0.9, SE = 0.4, p = 0.02) and GV (β = 0.4, SE = 0.2, p = 0.04) in N-GLYC, but not D-GLYC. In summary, our data suggest that meal patterns interact with dietary composition and should be evaluated as potential modifiable determinants of glucose in adults with and without dysglycemia. Future research should evaluate causality with controlled diets.
- Research Article
- 10.1080/07420528.2024.2342937
- Apr 25, 2024
- Chronobiology International
- Yan Yin Phoi + 4 more
ABSTRACT The irregular eating patterns of both shift workers and evening chronotypes adversely affect cardiometabolic health. A tool that conveniently captures temporal patterns of eating alongside an indicator of circadian rhythm such as chronotype will enable researchers to explore relationships with diverse health outcome measures. We aimed to investigate the test-retest reliability and convergent validity of a Chrononutrition Questionnaire (CNQ) that captures temporal patterns of eating and chronotype in the general population (non-shift workers, university students, retirees, unemployed individuals) and shift work population. Participants attended two face-to-face/virtual sessions and completed the CNQ and food/sleep/work diaries. Outcomes included subjective chronotype, wake/sleep/mid-sleep time, sleep duration, meal/snack regularity, meal/snack/total frequency, times of first/last/largest eating occasions (EO), main meal (MM) 1/2/3, and duration of eating window (DEW). 116 participants enrolled (44.5 ± 16.5 years, BMI: 27.3 ± 5.8 kg/m2, 73% female, 52% general population); 105 completed the study. Reliability was acceptable for chronotype, sleep, and all temporal eating patterns except on night shifts. Convergent validity was good for chronotype and sleep except for certain shift/shift-free days. Generally, meal/snack regularity and frequency, and times of first/last EO showed good validity for the general population but not shift workers. Validity was good for DEW (except work-free days and afternoon shifts) and times of MM 1/2/3 (except afternoon and night shifts), while time of largest EO had poor validity. The CNQ has good test-retest reliability and acceptable convergent validity for the general and shift work population, although it will benefit from further validation, especially regarding regularity, frequency, and times of first and last eating occasions across more days amongst a larger sample size of shift workers. 
Use of the CNQ by researchers will expand our current understanding of chrononutrition as relationships between timing of food intake and the multitude of health outcomes are examined.
- Research Article
- 10.1161/circ.149.suppl_1.p201
- Mar 19, 2024
- Circulation
- Yoriko Heianza + 6 more
Introduction: Chronic disruption of circadian rhythms is linked to weight gain and metabolic dysregulation. Time-restricted eating (TRE), a form of intermittent fasting, has shown effectiveness in improving short-term weight loss and energy homeostasis. However, associations between habitual adherence to TRE and long-term weight change remain understudied. Hypothesis: We tested whether adherence to TRE, assessed by gold-standard seven-day dietary records (7DDRs), was related to 5-year change in body weight among women. Methods: The present analysis included 650 women (mean [SD]: age 63 [9] y; BMI 26.4 [5.3] kg/m2) without cardiovascular disease at baseline who completed 7DDRs in a sub-study of the Nurses' Health Studies (NHS/NHSII), the Women's Lifestyle Validation Study (WLVS) (2010-12). TRE was indicated by a daily eating window (EW) of 4-12 hours. Adherence to TRE was assessed by summing the TRE days in a week. We also calculated 7-day averaged values of EW hours and last/first time of eating occasion (EO), and the within-person variability of these variables. Weight change from baseline to a follow-up survey (NHS: 2016-18; NHSII: 2015-17) was analyzed. Total energy expenditure (TEE) and physical activity energy expenditure (PAEE) were measured at baseline using the doubly labeled water dilution method. Results: Longer averaged hours of EW were related to greater weight gain (β 0.7 [0.3] kg per 2 hours) after adjusting for covariates of demographic factors, total energy intake, physical activity, alcohol, within-person variation of EW, the averaged time of last EO, and the initial body weight (p = 0.02). Consistently, adherence to TRE was associated with less weight gain (β -0.25 [SE 0.1] kg per day increment; p = 0.03) in a model adjusting for these covariates. We found that lower adherence to TRE and longer hours of EW were associated with weight gain (p < 0.05 for both) when the last eating time occurred later in the day rather than earlier. The baseline energy metabolism (TEE and PAEE) modified the associations between hours of EW and weight changes (P interaction EW-TEE = 0.006; P interaction EW-PAEE = 0.01), showing significant relationships particularly among women with higher levels of adiposity and greater energy expenditure at baseline. Conclusions: Adherence to TRE and fewer habitual hours of eating were related to long-term weight changes among middle-aged and elderly women. The last eating time and energy homeostasis may partly modify the associations.
- Research Article
- 10.1161/circ.149.suppl_1.p192
- Mar 19, 2024
- Circulation
- Meng Chen + 1 more
Introduction: Time-restricted eating (TRE) has gained popularity as a dietary intervention that limits daily food consumption to a 4- to 12-hour window. Most short-term randomized controlled trials reported that TRE improved cardiometabolic risk profiles. However, whether TRE is associated with long-term hard endpoints remains unknown. Hypothesis: We assessed the hypothesis that TRE is associated with a reduced risk of all-cause and cause-specific mortality. Methods: Participants aged at least 20 years who completed two valid 24-hour dietary recalls and reported usual intake in both recalls were included from the National Health and Nutrition Examination Survey in 2003-2018. Mortality status as of December 2019 was obtained through linkage to the National Death Index. An eating occasion required consuming more than 5 kcal of foods or beverages. Eating duration between the last and first eating occasion was calculated for each day. The average duration of the two recall days defined the typical eating duration, which was then categorized as <8, 8-<10, 10-<12, 12-16 (reference group; the mean duration in US adults), and >16 hours. Multivariable Cox proportional hazards models were employed to estimate the association of eating duration with all-cause and cause-specific mortality in the overall sample and among adults with cardiovascular disease or cancer. Adjusted hazard ratios (HRs) and 95% confidence intervals (CIs) were derived. Results: Among 20,078 adults included, the weighted mean (SE) age was 48.5 (0.3) years, 50.0% were men, and 73.3% were non-Hispanic White. During a median follow-up of 8.0 years (IQR, 4.2-11.8), 2797 all-cause deaths occurred, including 840 cardiovascular deaths and 643 cancer deaths.
Compared with eating duration of 12-16 hours, eating duration <8 hours was significantly associated with an increased risk of cardiovascular mortality (HR, 1.96 [95% CI, 1.23-3.13]); this association was also observed in adults with cardiovascular disease (HR, 2.06 [95% CI, 1.12-3.81]) and adults with cancer (HR, 2.72 [95% CI, 1.28-5.80]). Other eating durations were not associated with cardiovascular mortality, except for eating duration of 8-<10 hours in people with cardiovascular disease (HR, 1.64 [95% CI, 1.02-2.63]). No significant associations were found between eating duration and all-cause or cancer mortality in the overall sample and diseased subsamples, except that eating duration >16 hours was associated with a lower risk of cancer mortality in people with cancer (HR, 0.46; [95% CI, 0.22-0.95]). Conclusions: In US adults, TRE with eating duration <8 hours was significantly associated with a higher risk of cardiovascular mortality in the general population as well as in people with cardiovascular disease or cancer. These findings do not support long-term use of 16:8 TRE for preventing cardiovascular death.
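The eating-duration measure in the methods above (time between the last and first eating occasion of more than 5 kcal per day) can be sketched as follows; the timestamps and kcal values are invented for illustration.

```python
# Sketch of the daily eating-duration calculation described in the study:
# keep only intake events above 5 kcal, then take last minus first event
# time. The day's events below are hypothetical, not NHANES data.

from datetime import datetime

def eating_duration_hours(events, min_kcal=5):
    """events: list of (ISO timestamp string, kcal). Returns hours between
    the first and last qualifying eating occasion of the day."""
    times = [datetime.fromisoformat(t) for t, kcal in events if kcal > min_kcal]
    if len(times) < 2:
        return 0.0
    return (max(times) - min(times)).total_seconds() / 3600

day = [
    ("2018-01-01T07:30", 350),  # breakfast
    ("2018-01-01T10:00", 3),    # black coffee: under 5 kcal, ignored
    ("2018-01-01T12:15", 600),  # lunch
    ("2018-01-01T19:45", 700),  # dinner
]
print(eating_duration_hours(day))  # 12.25 -> falls in the 12-16 h reference band
```

Averaging this value over the two recall days gives the "typical eating duration" that the study categorized into the <8 to >16 hour bands.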
- Research Article
- 10.1017/s0007114523002350
- Oct 31, 2023
- British Journal of Nutrition
- Jenna Rahkola + 8 more
Later timing of eating has been associated with higher adiposity among adults and children in several studies, but not all. Moreover, studies in younger children are scarce. Hence, this study investigated the associations of the timing of evening eating with BMI Z-score and waist-to-height ratio (WHtR), and whether these associations were moderated by chronotype among 627 preschoolers (3-6-year-olds) from the cross-sectional DAGIS survey in Finland. Food intake was measured with 3-d food records, and sleep was measured with hip-worn actigraphy. Three variables were formed to describe the timing of evening eating: (1) clock time of the last eating occasion (EO); (2) time between the last EO and sleep onset; and (3) percentage of total daily energy intake (%TDEI) consumed 2 h before sleep onset or later. Chronotype was assessed as a sleep debt-corrected midpoint of sleep on the weekend (actigraphy data). The data were analysed with adjusted linear mixed effects models. After adjusting for several confounders, the last EO occurring closer to sleep onset (estimate = -0·006, 95 % CI (-0·010, -0·001)) and higher %TDEI consumed before sleep onset (estimate = 0·0004, 95 % CI (0·00003, 0·0007)) were associated with higher WHtR. No associations with BMI Z-score were found after adjustments. Clock time of the last EO was not significantly associated with the outcomes, and no interactions with chronotype emerged. The results highlight the importance of studying the timing of eating relative to sleep timing instead of only as clock time.
- Research Article
- 10.1093/sleepadvances/zpad035.138
- Oct 23, 2023
- Sleep Advances
- Y Phoi + 4 more
Abstract Chrononutrition investigates temporal patterns of eating. Irregular patterns in shift workers and evening chronotypes adversely affect cardiometabolic health. We investigated the test-retest reliability and convergent validity of a Chrononutrition Questionnaire that aims to capture temporal patterns of eating and chronotype in shift and non-shift workers. In total, 58 non-shift and 47 shift workers completed the study. Outcomes included: 1) chronotype; 2) sleep: wake/sleep/mid-sleep time and sleep duration; 3) temporal eating patterns: meal/snack regularity and frequency, times of first/last/largest eating occasions (EO), main meals (MM) 1/2/3, and duration of eating window (DEW) on work and work-free days (non-shift workers) and morning/afternoon/night/work-free days (shift workers). Test-retest reliability was assessed with intraclass correlation coefficients and weighted kappa, and convergent validity was determined against food and sleep diaries (Spearman rank coefficients). Reliability was acceptable for chronotype, sleep, and all temporal eating patterns except morning shifts (last EO) and night shifts (last EO, DEW). Convergent validity was good for chronotype and sleep, except for wake times and/or sleep duration on work-free days after morning and afternoon shifts. Meal/snack regularity and frequency showed good validity for non-shift but not shift workers. Times of first/last EO, MM1/2/3, and DEW generally showed good validity except on work-free days, morning shifts, and night shifts. Time of the largest EO was poorly correlated except for night shifts. The Chrononutrition Questionnaire has good test-retest reliability and acceptable convergent validity.
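Convergent validity here reduces to rank agreement between questionnaire and diary values. A sketch of the Spearman rank coefficient used for that comparison, as a textbook implementation (Pearson correlation of averaged ranks) rather than the authors' analysis code:

```python
import math

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with ties sharing the average of the ranks they span."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # extend j over any run of tied values
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average 1-based rank of the tie group
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)
```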
- Research Article
- 10.1177/19322968231197205
- Sep 25, 2023
- Journal of diabetes science and technology
- Collin J Popp + 8 more
Accurately identifying eating patterns, specifically the timing, frequency, and distribution of eating occasions (EOs), is important for assessing eating behaviors, especially for preventing and managing obesity and type 2 diabetes (T2D). However, existing methods to study EOs rely on self-report, which may be prone to misreporting and bias and has a high user burden. Therefore, objective methods are needed. We aim to compare EO timing using objective and subjective methods. Participants self-reported EO with a smartphone app (self-report [SR]), wore the ActiGraph GT9X on their dominant wrist, and wore a continuous glucose monitor (CGM, Abbott Libre Pro) for 10 days. EOs were detected from wrist motion (WM) using a motion-based classifier and from CGM using a simulation-based system. We described EO timing and explored how timing identified with WM and CGM compares with SR. Participants (n = 39) were 59 ± 11 years old, mostly female (62%) and White (51%) with a body mass index (BMI) of 34.2 ± 4.7 kg/m2. All had prediabetes or moderately controlled T2D. The median times of day of the first EO (and interquartile ranges) for SR, WM, and CGM were 08:24 (07:00-09:59), 09:42 (07:46-12:26), and 06:55 (04:23-10:03), respectively. The median last EO for SR, WM, and CGM were 20:20 (16:50-21:42), 20:12 (18:30-21:41), and 21:43 (20:35-22:16), respectively. The overlap between SR and CGM was 55% to 80% of EO detected with tolerance periods of ±30, 60, and 120 minutes. The overlap between SR and WM was 52% to 65% of EO detected with tolerance periods of ±30, 60, and 120 minutes. The continuous glucose monitor and WM detected overlapping but not identical meals and may provide complementary information to self-reported EO.
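The tolerance-window overlap reported above can be sketched as a simple event-matching rule. A minimal Python version, assuming EO times are expressed as minutes from midnight; this is a simplified one-to-many match, and the paper's exact pairing procedure may differ:

```python
def overlap_fraction(reference, detected, tolerance_min=30):
    """Fraction of reference EOs that have at least one detected EO
    within +/- tolerance_min minutes. Times are minutes from midnight."""
    if not reference:
        return 0.0
    hits = sum(1 for r in reference
               if any(abs(r - d) <= tolerance_min for d in detected))
    return hits / len(reference)
```

Widening the tolerance can only keep or raise the overlap, which is why the abstract reports ranges across the ±30, 60, and 120 minute windows.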
- Research Article
- 10.3390/nu15183868
- Sep 5, 2023
- Nutrients
- Stanislava S Katsarova + 11 more
Chronotype studies investigating dietary intake, eating occasions (EO) and eating windows (EW) are sparse in people with type 2 diabetes mellitus (T2DM). This analysis reports data from the CODEC study. The Morningness-Eveningness Questionnaire (MEQ) assessed chronotype preference. Diet diaries assessed dietary intake and its temporal distribution. Regression analysis assessed whether dietary intake, EW, or EO differed by chronotype. A total of 411 participants were included in this analysis. There were no differences in energy or macronutrient intake, or in EW, between chronotypes. Compared to evening chronotypes, morning and intermediate chronotypes consumed 36.8 (95% CI: 11.1, 62.5) and 20.9 (95% CI: −2.1, 44.1) fewer milligrams of caffeine per day, respectively. Evening chronotypes woke up over an hour and a half later than morning chronotypes (01:36, 95% CI: 01:09, 02:03) and over half an hour later than intermediate chronotypes (00:45, 95% CI: 00:21, 01:09). Evening chronotypes went to sleep over an hour and a half later than morning chronotypes (01:48, 95% CI: 01:23, 02:13) and an hour later than intermediate chronotypes (01:07, 95% CI: 00:45, 01:30). Evening chronotypes’ EOs and last caffeine intake occurred at later clock times, but in line with their later sleep timings. Future research should investigate the impact of chronotype and the temporal distribution of dietary intake on glucose control to optimise T2DM interventions.
- Research Article
- 10.3390/nu14204356
- Oct 18, 2022
- Nutrients
- Vanessa Jaeger + 9 more
Meal timing is suggested to influence the obesity risk in children. Our aim was to analyse the effect of energy and nutrient distributions at eating occasions (EO), including breakfast, lunch, supper, and snacks, on the BMI z-score (zBMI) during childhood in 729 healthy children. BMI and three-day dietary protocols were obtained at 3, 4, 5, 6, and 8 years of age, and dietary data were analysed as the percentage of the mean total energy intake (TEI; %E). Intakes at EOs were transformed via an isometric log–ratio transformation and added as exposure variables to linear mixed-effects models. Stratified analyses by country and recategorization of EOs by adding intake from snacks to respective meals for further analyses were performed. The exclusion of subjects with less than three observations and the exclusion of subjects who skipped one EO or consumed 5% energy or less at one EO were examined in sensitivity analyses. Around 23% of the children were overweight at a given time point. Overweight and normal-weight children showed different distributions of dietary intakes over the day; overweight children consumed higher intakes at lunch and lower intakes of snacks. However, no significant effects of timing of EOs on zBMI were found in regression analyses.
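The isometric log-ratio (ilr) step mentioned above maps compositional energy shares (which sum to 100%) into unconstrained coordinates that can safely enter a linear mixed-effects model. A sketch of the standard pivot-coordinate construction; the study's chosen basis or partition may differ:

```python
import math

def ilr(shares):
    """Isometric log-ratio transform of a composition (positive shares
    summing to 1), using the sequential pivot-coordinate basis. A D-part
    composition maps to D-1 unconstrained coordinates."""
    assert all(s > 0 for s in shares), "ilr needs strictly positive parts"
    z = []
    for i in range(1, len(shares)):
        # geometric mean of the first i parts vs. the (i+1)-th part
        gm = math.exp(sum(math.log(s) for s in shares[:i]) / i)
        z.append(math.sqrt(i / (i + 1)) * math.log(gm / shares[i]))
    return z
```

The strict-positivity requirement is also why the sensitivity analyses above exclude children who skipped an EO or consumed 5% energy or less at one EO: zero or near-zero shares break (or destabilize) the log-ratios.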