Complete Word file: Effects of temporal variation in the risk of predation by the least weasel (Mustela nivalis) on the feeding behavior of the field vole (Microtus agrestis)
Note: the PowerPoint file and its slides are included free of charge along with the Word file of this product.
This article is a translation of an authoritative English reference paper; it has been translated to a high standard by specialists in the field and is delivered as a Microsoft Word file.
The text of the article is clear, rich in content, and easy to follow; we guarantee its quality.
The Word file is cleanly typed, can be copied and edited, and is carefully formatted; a PowerPoint file with an attractive template and a range of presentation settings is also provided alongside it.
Note: any formatting problems you may notice in the text below result from copying the content out of the file; the original Word file of "Effects of temporal variation in the risk of predation by least weasel (Mustela nivalis) on feeding behavior of field vole (Microtus agrestis)" contains no such problems.
Number of pages in this file: 21
Excerpt from the translation:
Excerpt from the English article. English title: Effects of temporal variation in the risk of predation by least weasel (Mustela nivalis) on feeding behavior of field vole (Microtus agrestis)
Abstract
Predation risk tends to vary in time. Thus, prey animals face the problem of allocating feeding and antipredator effort across different risk situations. A recent model of Lima and Bednekoff (1999) predicts that a prey animal should allocate more feeding effort to low risk situations and more antipredator effort to high risk situations as the relative degree of risk in high risk situations (the attack ratio) increases. Furthermore, when the proportion of time the prey spends in the high risk situation (p) increases, the prey eventually has to feed in the high risk situations as well. However, the increase in feeding effort in low risk situations should clearly exceed that in high risk situations as p increases. To test these predictions we measured the feeding effort of field voles (Microtus agrestis) exposed to varying presence of a least weasel (Mustela nivalis) and its feces under laboratory conditions. We generated quantitative predictions by estimating attack ratios from the results of a pilot experiment. The model explained 15% of the observed variation in the feeding effort of voles. Further analyses indicated that feeding effort was lower in high risk situations than in low risk situations at the high attack ratio, but not at the lower one. Voles exposed to the presence of a weasel for extended periods showed signs of nutritional stress. Still, we did not find any increase in feeding effort with increasing p. This was obviously due to the relatively low maximal p we used, as we included only conditions likely to occur in nature.
1 Introduction
Actively hunting predators are usually very mobile. Thus, the risk of predation perceived by their prey tends to vary in time. Due to the trade-off between antipredator behavior and other fundamental activities, such as feeding and mating, a prey animal should benefit greatly from an ability to adjust its level of vigilance to variation in the current level of risk. It seems that many animals possess this ability (Kats and Dill, 1998).
The predation risk allocation hypothesis of Lima and Bednekoff (1999) analyzes how temporal variation in risk affects the allocation of antipredator behavior and foraging effort across different risk situations. The hypothesis states that an animal's response to predation risk at one time period should depend on the risk experienced at other times. In an environment with a variable risk of predation, the animal spends a certain proportion of its time in a high risk situation (p) and the rest of it (1 − p) in a less dangerous situation. In the high risk situation attacks by the predator occur at rate aH, and in the low risk situation at rate aL. The animal has to decide how much foraging effort (or vigilance) to allocate across the two risk states such that survival is maximized and its energy requirements are met.
According to the theoretical analysis of Lima and Bednekoff (1999), the attack ratio (aH/aL) and the proportion of time spent in the high risk situation (p) are the main factors affecting allocation decisions. Under conditions with invariant risk (aH/aL = 1), optimal foraging effort allows the animal to meet its energy requirements but should not depend on the actual risk level (Houston et al., 1993; Lima and Bednekoff, 1999). However, when the attack ratio increases (aH > aL), a prey animal decreases its foraging effort in high risk situations and increases it in low risk situations. So the difference between levels of foraging effort in high and low risk situations should increase as aH/aL increases. An increase in p may force the animal to forage also during high risk situations in order to meet its energy requirements. Of course, under these conditions the animal tries to forage as efficiently as possible during the short periods of lower risk intervening between long periods of high risk. Thus foraging effort in both high and low risk situations should increase with p, but the increase should be more pronounced in low risk situations than in high risk situations, especially when aH/aL is high. On the other hand, the hypothesis predicts that the animal is most vigilant, and feeds the least, during short periods of high risk when aH/aL is high (Fig. 1).
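To make the allocation logic above concrete, the following is a minimal numerical sketch in Python. It is not the Lima and Bednekoff (1999) model itself and is not taken from the paper: it assumes that intake is proportional to foraging effort, that expected mortality in each risk state is proportional to the attack rate times the effort spent there, and that the vole must meet a fixed intake requirement R per high/low-risk cycle. All names (allocate_foraging, R, a_high, a_low) are illustrative. With these deliberately linear assumptions only the p effect appears (feeding spills over into high risk periods as p grows); the graded attack-ratio effect shown in Fig. 1 requires the fuller survival dynamics of the original model.

```python
# Minimal sketch, NOT the Lima and Bednekoff (1999) model itself.
# Assumptions (illustrative only): effort lies in [0, 1], intake is
# proportional to effort, expected mortality is
# p*a_high*f_high + (1-p)*a_low*f_low, and the vole must reach a fixed
# intake requirement R per high/low-risk cycle.

def allocate_foraging(p, a_high, a_low, R=0.6):
    """Return (f_low, f_high, mortality) minimizing expected mortality
    subject to the intake constraint p*f_high + (1-p)*f_low >= R.

    With linear intake and mortality and a_high > a_low, the optimum is
    a corner solution: feed as much as possible during low-risk periods
    and spill over into high-risk periods only when the low-risk time
    (1-p) alone cannot cover the requirement R.
    """
    if not 0 <= p < 1:
        raise ValueError("p must lie in [0, 1) for this sketch")
    low_capacity = 1.0 - p                 # max intake obtainable at low risk
    if low_capacity >= R:
        f_low, f_high = R / (1.0 - p), 0.0
    else:
        f_low = 1.0
        f_high = (R - low_capacity) / p    # remainder forced into high risk
    mortality = p * a_high * f_high + (1.0 - p) * a_low * f_low
    return f_low, f_high, mortality


if __name__ == "__main__":
    for p in (0.2, 0.5, 0.8):              # proportion of time at high risk
        f_low, f_high, m = allocate_foraging(p, a_high=10.0, a_low=1.0)
        print(f"p={p}: f_low={f_low:.2f}, f_high={f_high:.2f}, mortality={m:.2f}")
```

Running the sketch shows that at p = 0.2 all feeding is done in the low-risk state, while at p = 0.5 and p = 0.8 the requirement can no longer be met during low-risk time alone and feeding effort in the high-risk state rises, mirroring the qualitative prediction that prey must eventually feed under high risk as p increases.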