AUC Distracted Driver Dataset

This is PyTorch code for the driver posture classification task: the aim is to identify whether a driver is driving safely or is engaged in distraction activities such as texting or drinking. We use the AUC Distracted Driver Dataset [1] together with the State Farm Distracted Driver Detection dataset from Kaggle.
Driving a car is a complex task that requires complete attention, and distracted driving is any activity that takes the driver's attention away from the road. The World Health Organization reports roughly 1.35 million deaths per year from road traffic crashes worldwide, a number that has kept increasing over the last few years, and nearly a fifth of these accidents are caused by distracted drivers. Existing work on distracted driver detection was long concerned with a small set of distractions (mostly cell phone usage), so Abouelnaga et al. at the American University in Cairo created the AUC Distracted Driver Dataset (AUC-DDD), which covers the same ten activities as the State Farm dataset. It was the first publicly available dataset for distracted driver detection and was collected in a parked vehicle using an ASUS ZenPhone rear camera fixed with an arm strap to the car roof handle above the passenger's seat. Sample frames of the dataset are shown in Fig. 1. The main alternative datasets are the State Farm dataset, the Southeast University (SEU) distracted driver dataset, and the RGB-D dataset, which was built with a Kinect sensor and updated by the University of California, San Diego, with the main task of detecting whether the driver's hands are on the wheel. The State Farm and AUC datasets are image-based and lack temporal information, which is why later work also used the video versions of both datasets to exploit the sequence information of the images.
A range of methods has been evaluated on these datasets. Abouelnaga et al. trained five CNNs on the original image, the face, the hands, the face together with the hands, and a skin-segmented image, and combined their outputs in a genetically weighted ensemble, reporting around 90% classification accuracy; the follow-up system by Eraqi et al. reaches 95.98% driving posture classification accuracy. Lightweight architectures such as mobileVGG, built from depthwise separable convolutions with only about 2M parameters, target real-time use, while D-HCNN improves performance with HOG feature images, L2 weight regularization, dropout, and batch normalization. Teacher-student knowledge-distillation networks, capsule networks (CAT-CapsNet), and recurrent models such as multi-stream LSTMs that exploit temporal information have also reported strong results on the State Farm and AUC datasets.
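As an illustration of the weighted-ensemble idea, the sketch below averages per-model class probabilities with a fixed weight vector; it is not the authors' implementation, and the genetic-algorithm search that tunes the weights in the original work is omitted.

```python
import numpy as np

def weighted_ensemble(probs_per_model, weights):
    """Combine per-model class probabilities with a weighted average.

    probs_per_model: shape (num_models, num_samples, num_classes)
    weights: shape (num_models,), e.g. tuned by a genetic algorithm
    """
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()                           # normalize weights to sum to 1
    combined = np.tensordot(weights, probs_per_model, axes=1)   # (num_samples, num_classes)
    return combined.argmax(axis=1)                              # predicted class per sample

# Toy usage: 3 hypothetical models, 5 samples, 10 posture classes.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=(3, 5))                 # shape (3, 5, 10), rows sum to 1
print(weighted_ensemble(probs, weights=[0.5, 0.3, 0.2]))
```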
The AUC dataset defines ten driver postures: safe driving plus nine distracted behaviors, including adjusting the radio, drinking, doing hair and makeup, reaching behind, talking to a passenger, and texting or talking on a phone. Version 1 was recorded by 31 participants (22 males and 9 females) from seven countries, who simulated the same postures as the State Farm dataset. The second revision, AUC Distracted Driver V2 (Eraqi et al., 2019), adds more subjects and images: it contains 14,478 images of 44 drivers (29 males and 15 females) from the USA, Canada, Germany, Morocco, Uganda, Egypt, and Palestine, collected with the ASUS ZenPhone close-range camera and a DS325 Sony DepthSense camera. A split-by-driver version of AUC V2 is available in which the training set contains 12,555 images of 38 drivers and the test set contains 1,923 images of 6 drivers. Since the original AUC V2 release did not provide driver IDs, we annotated them manually; the file datasets/driver_IDs.csv maps the original data to the driver IDs.
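To keep the evaluation driver-disjoint, images can be partitioned by driver ID rather than at random. The sketch below is a minimal example of such a split; the column names img and driver_id are assumptions about datasets/driver_IDs.csv and may need to be adjusted to the actual header.

```python
import pandas as pd
from sklearn.model_selection import GroupShuffleSplit

# Assumed columns: 'img' (image path or name) and 'driver_id'.
df = pd.read_csv("datasets/driver_IDs.csv")

# Hold out roughly 20% of the drivers (not of the images) for testing.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=42)
train_idx, test_idx = next(splitter.split(df, groups=df["driver_id"]))

train_imgs = df.iloc[train_idx]["img"].tolist()
test_imgs = df.iloc[test_idx]["img"].tolist()
print(len(train_imgs), "train images,", len(test_imgs), "test images,",
      df["driver_id"].nunique(), "drivers in total")
```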
State Farm Distracted Driver Detection (SFD3) Dataset

According to automotive safety surveys, one in five traffic accidents is caused by a distracted driver; every year distracted driving leads to roughly 425,000 injuries and 3,000 deaths. State Farm hopes to improve these alarming statistics, and better insure their customers, by testing whether dashboard cameras can automatically detect drivers engaging in distracted behaviors: given a dataset of 2D dashboard-camera images, the Kaggle State Farm Distracted Driver Detection competition challenges participants to classify each driver's behavior. The labeled training portion contains 22,424 images of 26 drivers (subjects) covering the ten behaviors c0-c9, and the test portion contains more than 70,000 unlabeled images; use of the dataset is restricted to the purposes of the competition. The data are distributed as imgs.zip, a zipped folder of all train/test images; after unzipping, the imgs folder contains train and test sub-folders, and the train folder is organized into the ten class folders c0-c9. Figure 2 shows the distribution of classes in the dataset.
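The class distribution in Figure 2 can be reproduced by counting the files in each class folder; a minimal sketch, assuming the imgs/train/c0 ... imgs/train/c9 layout described above:

```python
from pathlib import Path

train_dir = Path("imgs/train")  # ten sub-folders c0 ... c9, one per behavior

counts = {d.name: sum(1 for f in d.iterdir() if f.suffix.lower() == ".jpg")
          for d in sorted(train_dir.iterdir()) if d.is_dir()}

for cls, n in sorted(counts.items()):
    print(f"{cls}: {n} images")
print("total:", sum(counts.values()))  # 22,424 for the full labeled training set
```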
Because labels are only provided for the training data, we split the training portion of the dataset into a new training set (80%) and test set (20%). Every image is resized to 224×224 pixels and normalized with a fixed per-channel mean and standard deviation (mean [0.485, 0.456, 0.406], standard deviation [0.229, 0.224, 0.225]). We evaluate the proposed network on the AUC distracted driver detection dataset as well as on State Farm's dataset and compare its performance with state-of-the-art CNN architectures from the literature, using the average cross-entropy loss, accuracy, F1-score, and training time as evaluation metrics.
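A minimal PyTorch sketch of this preprocessing and of the 80/20 split (the exact augmentation, batch size, and seed used in the experiments may differ):

```python
import torch
from torchvision import datasets, transforms

# Resize to 224x224 and normalize with the fixed per-channel statistics quoted above.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

full_train = datasets.ImageFolder("imgs/train", transform=preprocess)

# Split the labeled training images into new train (80%) and test (20%) subsets.
n_test = int(0.2 * len(full_train))
train_set, test_set = torch.utils.data.random_split(
    full_train, [len(full_train) - n_test, n_test],
    generator=torch.Generator().manual_seed(42))

train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=64)
```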
The code in this repository detects and classifies distracted driver behaviors with a deep CNN, treating distracted driver posture recognition as part of the human action recognition framework. The data/ directory holds all the datasets: data/AUC contains the AUC distracted driver dataset, of which we only use the train and test splits under data/AUC/v2_cam1_cam2_split_by_driver/Camera 1, and data/StateFarm contains the Kaggle State Farm dataset, of which we only use train/ since test/ is unlabeled.
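The sketch below wires this layout into torchvision ImageFolder datasets. The folder names mirror the layout described above, the train/test sub-folder structure under Camera 1 is assumed, and in practice the preprocess transform from the earlier sketch would be passed in as well.

```python
from pathlib import Path
from torchvision import datasets

DATA_ROOT = Path("data")

# AUC V2, camera 1, split by driver (train/ and test/ are assumed to contain the class folders c0-c9).
auc_root = DATA_ROOT / "AUC" / "v2_cam1_cam2_split_by_driver" / "Camera 1"
auc_train = datasets.ImageFolder(auc_root / "train")
auc_test = datasets.ImageFolder(auc_root / "test")

# State Farm: only train/ is labeled, so only that folder is used.
statefarm_train = datasets.ImageFolder(DATA_ROOT / "StateFarm" / "train")

print(auc_train.classes)  # expected: ['c0', 'c1', ..., 'c9']
```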
Distracted driver classification (DDC) plays an important role in ensuring driving safety, but the available datasets still have limitations: many are small in data size, lack diversity in environmental variations, and are mostly obtained by cutting distracted-driving videos into individual images, so they miss important temporal information. Several video-based and multimodal alternatives have therefore been proposed, including the Drive&Act dataset, the Driver Monitoring Dataset (synchronized body, face, and hand cameras with RGB, depth, and IR streams recorded both in a real car and in a driving simulator), and the synthetic SynDD datasets, which were collected in a stationary vehicle with three in-vehicle cameras positioned on the dashboard, near the rearview mirror, and at the top right-side window corner, with annotations for distracted behaviors and gaze zones.
The State Farm and AUC datasets nevertheless remain the most frequently used benchmarks in related studies. The AUC Distracted Driver Dataset is distributed under the license agreement below, and all publications that report on research using the dataset should cite the MI-AUC publications.
AUC DISTRACTED DRIVER DATASET (V1) LICENSE AGREEMENT

The dataset is the sole property of the Machine Intelligence group at the American University in Cairo (MI-AUC) and is protected by copyright. The dataset shall remain the exclusive property of the MI-AUC, and the End User acquires no ownership, rights, or title of any kind in all or any parts of the dataset. The End User shall send all requests for the distribution of the dataset to the MI-AUC group.