Two approaches have classically been used in disease ecology to estimate epidemiological parameters from field studies: cross-sectional sampling of unmarked individuals and longitudinal capture-recapture setups, which generally involve smaller numbers of marked individuals because of cost and logistical constraints. Although the benefits of longitudinal setups are increasingly acknowledged in the disease ecology community, cross-sectional data remain largely overrepresented in the literature, probably because of the inherent costs of longitudinal surveys. In this context, we used simulated data to compare the performance of cross-sectional and longitudinal designs in estimating the force of infection (i.e., the rate at which susceptible individuals become infected). Then, inspired by recent methodological developments in quantitative ecology, we explored the benefits of integrating both cross-sectional (seroprevalence) and longitudinal (individual capture history) data sets. In doing so, we investigated the effects of host species life history, antibody persistence, and the degree of a priori knowledge of, and uncertainty in, demographic and epidemiological parameters, as these are expected to affect the level of inference possible from the data in different ways. Our results highlight the importance of considering these elements when determining optimal sampling designs. For long-lived species exposed to infectious agents that elicit persistent antibody responses, integrated designs are especially valuable because they achieve the performance of longitudinal designs even with relatively small longitudinal sample sizes. As an illustration, we applied this approach to a combination of empirical and simulated data inspired by a case of bats exposed to a rabies virus. Overall, this work highlights that serological field studies could greatly benefit from integrating cross-sectional and longitudinal designs.
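To make the cross-sectional estimation problem concrete, the sketch below simulates a serosurvey and recovers the force of infection (FOI) by maximum likelihood under the simple catalytic model P(seropositive | age a) = 1 - exp(-FOI * a). This is only an illustration under strong assumptions (constant FOI, known ages, lifelong antibody persistence, perfect test), not the simulation framework used in the study; the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a cross-sectional serosurvey of 500 unmarked individuals
# under a constant force of infection (FOI), assuming lifelong
# antibody persistence: seropositivity depends only on age.
true_foi = 0.15                                  # per-year rate (assumed)
ages = rng.uniform(0.5, 15.0, size=500)          # host ages in years
seropos = rng.random(500) < 1.0 - np.exp(-true_foi * ages)

# Maximum-likelihood estimate of the FOI: profile the binomial
# log-likelihood over a grid of candidate values.
grid = np.linspace(0.01, 1.0, 1000)
p = 1.0 - np.exp(-grid[:, None] * ages)          # shape (grid, individuals)
log_lik = np.where(seropos, np.log(p), np.log1p(-p)).sum(axis=1)
foi_hat = grid[np.argmax(log_lik)]
print(f"true FOI = {true_foi:.2f}, estimated FOI = {foi_hat:.3f}")
```

Longitudinal data would instead enter the likelihood through individual capture histories, and an integrated design combines both likelihood components in a joint model.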