{'input': 'en[SEP]Peacock_Detection[SEP]they operate in several international markets . the company was founded in 1980 and has grown steadily since . this organization focuses on delivering quality services . he worked in finance and operations for over a decade . they operate in several international markets . this organization focuses on delivering quality services . the company was founded in 1980 and has grown steadily since . employees undergo regular training programs . he worked in finance and operations for over a decade . employees undergo regular training programs . the report was submitted for review . he worked in finance and operations for over a decade . the company was founded in 1980 and has grown steadily since . they operate in several international markets . employees undergo regular training programs . they operate in several international markets . employees undergo regular training programs . the company was founded in 1980 and has grown steadily since . he worked in finance and operations for over a decade . the system was updated to meet new regulatory requirements . the report was submitted for review . this organization focuses on delivering quality services . the company was founded in 1980 and has grown steadily since . this organization focuses on delivering quality services . they operate in several international markets . the report was submitted for review . the system was updated to meet new regulatory requirements . they operate in several international markets . they operate in several international markets . they operate in several international markets .', 'label': 0}
  0%|          | 0/20 [00:00<?, ?it/s]
/opt/lib/venv/lib/python3.11/site-packages/torch/utils/data/dataloader.py:665: UserWarning: 'pin_memory' argument is set as true but no accelerator is found, then device pinned memory won't be used.
  warnings.warn(warn_msg)
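The `pin_memory` warning above fires because the DataLoader was built with `pin_memory=True` on a machine with no accelerator. A minimal sketch of the usual fix, assuming a toy `TensorDataset` (the real dataset in this run is not shown), is to gate the flag on device availability:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in dataset; pinned memory only speeds up host-to-device copies,
# so enable it only when an accelerator is actually present.
ds = TensorDataset(torch.arange(10).float())
loader = DataLoader(ds, batch_size=4, pin_memory=torch.cuda.is_available())

# Iterating works the same either way; 10 items at batch_size=4 -> 3 batches.
batches = [b[0] for b in loader]
```

With `pin_memory` gated like this, the warning disappears on CPU-only hosts while GPU hosts keep the pinned-memory fast path.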
 20%|██        | 4/20 [00:28<01:37,  6.12s/it]
/opt/lib/venv/lib/python3.11/site-packages/transformers/configuration_utils.py:393: UserWarning: Some non-default generation parameters are set in the model config. These should go into either a) `model.generation_config` (as opposed to `model.config`); OR b) a GenerationConfig file (https://huggingface.co/docs/transformers/generation_strategies#save-a-custom-decoding-strategy-with-your-model). This warning will become an exception in the future.
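The `configuration_utils.py` warning says decoding parameters currently live in `model.config` and should be moved into a `GenerationConfig`. A minimal sketch of option (b), with hypothetical parameter values (the run's actual decoding settings are not shown in this log):

```python
from tempfile import TemporaryDirectory
from transformers import GenerationConfig

# Hypothetical decoding parameters; the point is that they belong in a
# GenerationConfig (or model.generation_config), not in model.config.
gen_cfg = GenerationConfig(max_length=128, num_beams=4)

with TemporaryDirectory() as d:
    gen_cfg.save_pretrained(d)                  # writes generation_config.json
    reloaded = GenerationConfig.from_pretrained(d)
```

Saving the file next to the model weights (instead of a temporary directory) makes `from_pretrained` pick it up automatically, which silences the warning before it becomes a hard error.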
 40%|████      | 8/20 [00:54<01:10,  5.86s/it]
 60%|██████    | 12/20 [01:21<00:47,  5.91s/it]
 80%|████████  | 16/20 [01:47<00:23,  5.87s/it]
100%|██████████| 20/20 [02:14<00:00,  5.84s/it]