Abstract
Introduction To reduce the use of animals in basic cardiovascular research, in vitro 3D cardiac organoids (COs) are increasingly applied in heart disease modelling and drug screening. However, electrophysiological recording of 3D COs remains challenging and expensive. We therefore developed a Generic deep learning-based Edge detection Tracking (GET) toolbox for motion tracking of COs’ electrophysiological activities from any video recording source.
Methods Our lab recently established a novel CO model (WT CO) with internal chamber-like cavities and vascularised networks, which recapitulates key structures of the early-developing heart. We also generated a knockout CO (KO CO) of STX18-AS1, a CHD-associated risk lncRNA gene identified from GWAS, to model congenital heart disease (CHD). Videos of the beating activity of live COs are recorded with a microscope, with or without calcium imaging. Using the GET toolbox, developed in the MATLAB environment, COs in the videos are first extracted with the deep learning-based Segment Anything Model (SAM), an AI tool that detects and segments designated objects from pre-defined prompt points. The extracted objects are then tracked using the Kanade-Lucas-Tomasi (KLT) method. Motion speeds and magnitudes of the tracked edges are determined in each video frame to quantify the dynamic movement of the segmented COs.
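The toolbox itself is implemented in MATLAB; as a language-agnostic illustration of the final quantification step described above, here is a minimal Python sketch that computes per-frame motion magnitudes and speeds from already-tracked edge points. The point coordinates would in practice come from SAM segmentation followed by KLT tracking; the function name, input layout, and frame rate below are hypothetical.

```python
import math

def motion_metrics(tracks, fps):
    """Per-frame motion magnitude (pixels) and speed (pixels/s),
    averaged over all tracked edge points.

    tracks: list of per-frame point lists, each [(x, y), ...]
    fps: video frame rate (frames per second)
    """
    magnitudes = []
    for prev, curr in zip(tracks, tracks[1:]):
        # mean displacement of corresponding points between consecutive frames
        disp = [math.hypot(x1 - x0, y1 - y0)
                for (x0, y0), (x1, y1) in zip(prev, curr)]
        magnitudes.append(sum(disp) / len(disp))
    speeds = [m * fps for m in magnitudes]  # convert to pixels per second
    return magnitudes, speeds

# Hypothetical example: two edge points each moving 1 px/frame along x
tracks = [[(0, 0), (10, 0)], [(1, 0), (11, 0)], [(2, 0), (12, 0)]]
mags, speeds = motion_metrics(tracks, fps=25)
# mags -> [1.0, 1.0]; speeds -> [25.0, 25.0]
```

Peaks in such a per-frame speed trace would correspond to contraction and relaxation events, from which beating rate and amplitude can be read off.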
Results CO videos captured on different days or with different microscopes vary widely in brightness, contrast, and background debris. The GET toolbox successfully extracts the targets and detects their edges (figure 1A). The edge tracking points move dynamically with the electrophysiological beating of the objects. The pixel variation speed reflects the excitation and relaxation rates of CO contraction (figure 1B), while pixel variations indicate contraction amplitudes and beating rates (figure 1C). The KO CO was found to have an irregular beating rate. We quantified 134 COs and found that KO COs (26/66, 39.39%) had a higher frequency of irregular beating than WT COs (4/68, 5.88%; χ2=21.65, p<0.01). Figure 1D shows a single contraction of both WT and KO COs, indicating reduced contraction excitation and amplitude in the KO CO. Additionally, the GET toolbox can detect contractions in videos of 2D cardiomyocytes at either the monolayer or single-cell level.
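The group comparison above is a Pearson chi-square test on a 2×2 contingency table (26/66 KO vs 4/68 WT COs with irregular beating). The standard closed-form expression for a 2×2 table, without continuity correction, reproduces the reported statistic; the helper function below is purely illustrative and uses no external dependencies.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without Yates continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# KO COs: 26 irregular, 40 regular; WT COs: 4 irregular, 64 regular
chi2 = chi2_2x2(26, 40, 4, 64)
print(round(chi2, 2))  # 21.65, matching the reported value
```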
Conclusions Overall, we developed an AI-based motion tracking toolbox that can be widely applied at the 3D, 2D, and single-cell levels for contraction analyses from video recordings of varying quality. GET is a versatile tool that benefits electrophysiological studies of in vitro cardiac models for genetic investigations.
| Original language | English |
| --- | --- |
| Journal | Heart |
| DOIs | |
| Publication status | Published - Jun 2024 |