
Data for: Melt electrowriting enabled 3D liquid crystal elastomer structures for cross-scale actuators and temperature field sensors

Authors: Xue, Zhengjie; Feng, Xueming;


Abstract

# Dataset of Melt electrowriting enabled 3D liquid crystal elastomer structures for cross-scale actuators and temperature field sensors

## Description of the data and file structure

### File structure and contents

- dataset: Datasets used to train and test the deep-learning models.
- dataset/preprocess: Python files that preprocess the acquired thermal images.
- dataset/test: Preprocessed dataset for testing the deep-learning model.
- dataset/train: Preprocessed dataset for training the deep-learning model.
- misc: Base model layers and utilities for building the deep-learning models.
- model: Results of our trained model.
- result: Results obtained by using the trained model for prediction.
- AI_model.py: Our deep-learning model, used for training and prediction.
- Res50.py: The structure of the deep-learning model that is invoked.
- Usage.md: How to use the Python files and datasets.

### How to use this dataset and process your own data

The processed dataset is provided in dataset/train and dataset/test and can be used directly. You can also process your own data with the following steps:

1. In the extract_data folder, create a new [number] folder as the main folder (mainDir) for the current working condition, and create a csv folder inside it to hold the csv files extracted in step 5 (e.g., E:\t_map\extract_data\1\csv).
2. Extract the video images. Modify VIDEO_PATH and mainDir in dataset/preprocess/config.yaml, switch to dataset/preprocess/, and run: python 1extract_image_data.py
3. Extract the corner points. Modify cornerPath and alignTxtPath in config.yaml, then run: python 2get_corner.py. Select the four sharp corner points on the frame in the order [top left, top right, bottom left, bottom right].
4. Perform time alignment: python 3registerTansform.py. Provide input to help distinguish the aligned temperature-map numbers and to find the best blend-map numbers.
5. Extract the csv files used to build the dataset from the thermal imager software and put them into the csv folder.
6. Start labeling. Modify x in config.yaml in the pre_data folder, then run: python 4createDataset.py
   1) Draw the polygon bounding box enclosing the 14 lines in the middle: left-click to add a point, right-click to delete a point, and press Enter to close the polygon.
   2) Follow the prompts to check the points.
   3) Follow the prompts to check the numbers.
7. Check the data: python 5checkDataset.py
8. Divide the dataset into a training set and a test set: python ../../datasetDivide.py (change folder_list and target_folder in datasetDivide.py to your own paths; a sketch of this edit is shown after the commands below).

Training the model:

python AI_model.py --train True --batch_size 128 --dataset_dir /path_to_dataset --model_dir /where_to_save_model

Evaluating the model:

python AI_model.py --train False --load_model_path /where_to_load_model --result_dir /where_to_save_results

All units are pixels.
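As a rough illustration of the edit required in step 8, the snippet below shows how folder_list and target_folder inside datasetDivide.py might be set. The variable names come from the usage notes above, while the paths are hypothetical placeholders modeled on the example path in step 1, not part of the released code.

```python
# Hypothetical example of the two variables that Usage.md asks you to edit in datasetDivide.py.
# The paths are placeholders (patterned on the example E:\t_map\extract_data\1); replace them
# with your own working-condition folders and the folder where the split should be written.
folder_list = [
    r"E:\t_map\extract_data\1",  # one entry per processed working-condition folder
    r"E:\t_map\extract_data\2",
    r"E:\t_map\extract_data\3",
]
target_folder = r"E:\t_map\dataset"  # destination for the train/test split
```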

Liquid crystal elastomers (LCEs) have garnered significant attention due to their remarkable capability to undergo reversible strains and shape transformations under various stimuli. Early studies on LCEs primarily focused on limited shape changes of macrostructures or quasi-3D microstructures. However, fabricating complex cross-scale LCE-based 3D structures remains challenging. Here, we report a compatible method, Melt-Electrohydrodynamic (Melt-EHD) 3D printing, to create LCE-based microfiber actuators and various 3D actuators across micrometer to centimeter scales, and showcase their actuation in response to thermal airflow stimuli. By controlling the printing parameters, microfiber actuators with different diameters (5 μm~70 μm) and tunable properties, including actuation strain (10%~55%), actuation stress (0~0.6 MPa), and large work density (~160 J/kg), have been demonstrated. Under a dynamic thermal airflow stimulus at 15 Hz, the microfiber actuators lift weights over 3500 times heavier than themselves. The 3D structures were obtained by depositing LCE microfibers along pre-programmed paths, including various gradient-responsive elementary structural units, a 1 mm-sized microgripper, and various large-area 3D lattice structures. In addition, by integrating a deep-learning model, we have demonstrated, for the first time, large-area (≥ centimeter scale), real-time (24 Hz sampling frequency), high-precision (~95%) LCE-grid-based spatial temperature field sensors with a spatial resolution of only 4 mm.

The dataset was collected with a large-area (≥ centimeter scale), real-time (24 Hz sampling frequency), high-precision (~95%) spatial temperature field (STF) sensor with a spatial resolution of 4 mm, and was then processed by the Python programs included here.

Keywords

FOS: Materials engineering, Soft robotics, Deep learning, 3D printing, Sensor
