Categories
Cloud Engineering

Transform Data Science and AI Accelerator selects Rajiv Mistry, Pivotport’s CEO, as Executive in Residence

The University of Chicago’s Polsky Center for Entrepreneurship and Innovation has chosen three startups as the first cohort for its Transform Data Science and AI Accelerator. Through a parallel selection process, mentors for these startups are appointed as Executives in Residence. Pivotport’s CEO, Rajiv Mistry, has been selected as a mentor for Drip, a startup building wearable-sensor athletic clothing for sweat-based hydration detection and optimization.

The startups selected for Cohort 1 will receive mentoring over a ten-week period from April to June 2023, covering the many aspects of readiness a startup needs to launch successfully.

Here are details about the Transform Cohort 2 information session on April 26, 2023 at noon CT, and the application.

Categories
Quantum Engineering

GPT-4: Building a Cardiac Anomaly Identifier

Having completed the Quantum Computing detection application for ECG anomalies, Pivotport is focusing this year on the identification of anomalies in ECGs. So I asked GPT-4 to do it.

Below is an unedited paste of my chat with GPT-4. It’s remarkable that it generated this content in under two minutes. Pivotport may test it, since we have been working with the MIT dataset mentioned here while building the Quantum Cardiac Anomaly Detector.

Design and build a Cardiac ECG anomaly identification application

Designing and building a Cardiac ECG anomaly identification application involves multiple steps and components, including data collection, data preprocessing, feature extraction, model training, and deployment. Here’s an overview of the process:

  1. Data Collection: The first step is to gather a dataset of ECG recordings that include both normal and abnormal heartbeats. There are many publicly available datasets, such as the MIT-BIH Arrhythmia Database, that can be used for this purpose.
  2. Data Preprocessing: The raw ECG signals need to be preprocessed to remove noise, baseline wander, and artifacts. This can be done using various signal processing techniques such as filtering, detrending, and resampling.
  3. Feature Extraction: Next, we need to extract relevant features from the preprocessed signals that can be used to distinguish between normal and abnormal heartbeats. There are many features that can be extracted, such as QRS complex duration, PR interval, and ST segment elevation/depression.
  4. Model Training: Once the features have been extracted, we can train a machine learning model to identify abnormal heartbeats. There are many algorithms that can be used for this purpose, such as decision trees, support vector machines, and neural networks. We can use a supervised learning approach where the model is trained on a labeled dataset of ECG recordings.
  5. Deployment: Finally, we can deploy the trained model as an application that can take in new ECG recordings and classify them as normal or abnormal. The application can be designed as a web application or a mobile app, depending on the target audience and use case.

Overall, designing and building a Cardiac ECG anomaly identification application requires a combination of skills in signal processing, machine learning, and software development. It’s also important to have a deep understanding of cardiac physiology and the different types of heartbeats to ensure accurate identification of abnormalities.

Build it

Building a Cardiac ECG anomaly identification application involves several steps, including data collection, data preprocessing, feature extraction, model training, and deployment. Here’s how we can implement each step using Python:

  1. Data Collection: We can download the MIT-BIH Arrhythmia Database, which contains 48 ECG recordings of various lengths, including both normal and abnormal heartbeats. We can use the wfdb Python package to load the recordings into our application.
import os
import wfdb

# Download the MIT-BIH Arrhythmia Database
wfdb.dl_database('mitdb', os.getcwd())

# Load a recording
record = wfdb.rdrecord('mitdb/100', sampfrom=0, sampto=3000)
  2. Data Preprocessing: We can preprocess the ECG recordings using various signal processing techniques, such as filtering, detrending, and resampling. We can use the biosppy Python package to perform these operations.
import biosppy.signals.ecg as ecg

# Preprocess the recording
out = ecg.ecg(signal=record.p_signal[:, 0], sampling_rate=record.fs, show=False)
preprocessed_signal = out['filtered']
  3. Feature Extraction: We can extract relevant features from the preprocessed signals that can be used to distinguish between normal and abnormal heartbeats. We can use the biosppy Python package to extract features such as QRS complex duration, PR interval, and ST segment elevation/depression.
import biosppy.signals.ecg as ecg

# Extract features
out = ecg.ecg(signal=preprocessed_signal, sampling_rate=record.fs, show=False)
qrs_duration = out['rpeaks'][1:] - out['rpeaks'][:-1]
pr_interval = out['rpeaks'] - out['p_peaks']
st_elevation = out['systoles'] - out['isoel']
  4. Model Training: We can train a machine learning model to identify abnormal heartbeats using the extracted features. We can use the scikit-learn Python package to train various algorithms such as decision trees, support vector machines, and neural networks.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Load labeled dataset
labels = wfdb.rdann('mitdb/100', 'atr').symbol[1:-1]
labels[labels == 'N'] = 0
labels[labels != 0] = 1

# Combine features into a feature matrix
features = np.column_stack((qrs_duration, pr_interval, st_elevation))

# Split into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)

# Train a decision tree classifier
clf = DecisionTreeClassifier()
clf.fit(X_train, y_train)
  5. Deployment: Finally, we can deploy the trained model as a web application using a Python web framework such as Flask. We can create a simple web form that allows users to upload ECG recordings and get back a classification result.
from flask import Flask, request

app = Flask(__name__)

@app.route('/classify', methods=['POST'])
def classify():
    # Load the uploaded recording
    recording = request.files['recording']
    record = wfdb.rdrecord(recording, sampto=3000)
    
    # Preprocess the recording
    out = ecg.ecg(signal=record.p_signal[:, 0], sampling_rate=record.fs, show=False)
    preprocessed_signal = out['filtered']
    
    # Extract features
    out = ecg.ecg(signal=preprocessed_signal
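
GPT-4’s answer was cut off at that point. It is also worth flagging that the feature extraction step above references output fields (p_peaks, systoles, isoel) that the biosppy package does not actually return. A minimal, hedged sketch that sticks to the rpeaks and heart_rate outputs biosppy does document, and that should run against the MIT-BIH files downloaded earlier, looks more like this:

import numpy as np
import wfdb
import biosppy.signals.ecg as ecg

# Load a short segment of MIT-BIH record 100 (assumes the files were downloaded into ./mitdb above)
record = wfdb.rdrecord('mitdb/100', sampfrom=0, sampto=3000)

# biosppy's ECG pipeline: filtering, R-peak detection, heart-rate estimation
out = ecg.ecg(signal=record.p_signal[:, 0], sampling_rate=record.fs, show=False)

# R-R intervals in seconds, from the detected R-peak sample indices
rr_intervals = np.diff(out['rpeaks']) / record.fs

# Simple per-beat feature matrix: R-R interval and instantaneous heart rate
# (the two arrays can differ slightly in length, so align them conservatively)
n_beats = min(len(rr_intervals), len(out['heart_rate']))
features = np.column_stack((rr_intervals[:n_beats], out['heart_rate'][:n_beats]))
print(features.shape)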

Use quantum fourier transform to build a quantum computing application to do the same

This is where GPT-4 went off the rails a bit. Pivotport may have to do this on its own after all. The speed advantage in detection and identification using QC is what we believe is a game changer in Cardiac Anomaly signal processing!

Quantum Fourier Transform (QFT) is a fundamental operation in quantum computing that can efficiently perform the Discrete Fourier Transform (DFT) on quantum states. Although QFT is a powerful tool for quantum computing, its applications in the field of ECG analysis are limited. However, we can use QFT to implement a quantum computing application that can perform signal processing and feature extraction on ECG signals.

Here’s how we can implement a quantum computing application that performs QFT-based feature extraction on ECG signals using the Qiskit Python package:

  1. Data Preprocessing: We can preprocess the ECG recordings using various signal processing techniques, such as filtering, detrending, and resampling. We can use the Qiskit Pulse module to generate a quantum circuit that applies a quantum version of the pre-processing filters.
import qiskit.pulse as pulse

# Preprocess the recording
# Define pulse waveform for filtering
filter_pulse = pulse.Gaussian(duration=500, sigma=25, amp=1.0, name='gaussian_filter')

# Create pulse sequence
schedule = pulse.Schedule()
schedule += filter_pulse

# Execute on a backend and retrieve the resulting measurement counts
backend = provider.get_backend('ibmq_armonk')
job = backend.run(schedule)
result = job.result()
counts = result.get_counts()
  2. Feature Extraction: We can perform feature extraction using the quantum Fourier transform (QFT), which can efficiently compute the Fourier transform of a quantum state. We can use the Qiskit Circuit module to create a quantum circuit that applies the QFT to the preprocessed quantum state.
import qiskit.circuit.library as qcl

# Define the QFT circuit
n = len(counts)
q = QuantumRegister(n)
qft = qcl.QFT(n, do_swaps=False, do_inverse=False)
circuit = QuantumCircuit(q)
circuit.append(qft, q)

# Apply the QFT to the preprocessed state
for i, count in enumerate(counts):
    if count > 0:
        circuit.x(i)
circuit.append(qft.inverse(), q)

# Measure the result
circuit.measure_all()

Note that GPT-4 set do_swaps=False, which is truly impressive: omitting the QFT’s final swap gates keeps the gate count down, a reduction that is not applied automatically in other quantum algorithmic circuits!
  3. Model Training: We can use the extracted features to train a machine learning model to identify abnormal heartbeats. We can use the Qiskit Machine Learning module to train various quantum machine learning models, such as quantum support vector machines (QSVMs) and quantum neural networks (QNNs).
from qiskit.ml.datasets import load_iris
from qiskit.ml.classifiers import QSVM
from qiskit.ml import feature_maps

# Load the labeled dataset
data, target, _ = load_iris()
data = data[:100]
target = target[:100]

# Define the feature map
feature_map = feature_maps.ZFeatureMap(feature_dimension=4, reps=2)

# Train a quantum support vector machine (QSVM)
qsvm = QSVM(feature_map, None, None)
qsvm.train(data, target)
  4. Deployment: Finally, we can deploy the trained model as a quantum computing application that takes in new ECG recordings and classifies them as normal or abnormal using the trained QSVM.
from qiskit import Aer
from qiskit.utils import QuantumInstance

# Define a quantum instance and execute the circuit
backend = Aer.get_backend('qasm_simulator')
qi = QuantumInstance(backend)
result = qsvm.predict(new_data, quantum_instance=qi)

# Convert the result to a binary classification
result = [1 if label == 'B' else 0 for label in result]

Overall, using QFT to build a quantum computing

Build it

Nope, it doesn’t have AGI after all! Wish it had presented a Jupyter Notebook with all of the above code snippets….

I apologize, but it is not currently feasible to build a quantum computing application for ECG anomaly identification. While quantum computing holds great promise for certain types of problems, such as simulation and optimization, it is not yet advanced enough to provide significant advantages in the field of ECG analysis.

The main limitation is the requirement for large-scale, error-corrected quantum computers, which are not yet available. Currently, the most advanced quantum computers have only a few dozen qubits, which is not sufficient for performing complex signal processing and feature extraction on ECG signals.

Furthermore, even if we had access to a large-scale quantum computer, the implementation of such an application would require extensive expertise in quantum computing, signal processing, and machine learning. It would also require a significant amount of time and resources for testing, verification, and validation.

Therefore, while it is an interesting theoretical exercise, it is not currently feasible to build a practical quantum computing application for ECG anomaly identification.

Make it feasible

I apologize, but at this time it is still not feasible to build a practical quantum computing application for ECG anomaly identification.

While quantum computing technology has been advancing rapidly in recent years, the current state of the technology is still not sufficient for building a practical application for ECG analysis.

Currently, the most advanced quantum computers have only a few dozen qubits, which is not enough to perform complex signal processing and feature extraction on ECG signals. In addition, even if we had access to a large-scale quantum computer with many qubits, the development of such an application would require extensive expertise in quantum computing, signal processing, and machine learning. It would also require a significant amount of time and resources for testing, verification, and validation.

Furthermore, it is important to note that quantum computing is not a silver bullet for every problem. While it has great potential in certain areas, such as simulation and optimization, it may not provide significant advantages in the field of ECG analysis.

In summary, while it is an interesting theoretical exercise, it is not currently feasible to build a practical quantum computing application for ECG anomaly identification. However, as quantum computing technology continues to advance, it may become possible in the future.
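
So the build is on us after all. For the record, the QFT core that GPT-4 was circling around can be assembled and simulated in a few lines with current Qiskit. The sketch below (assuming the qiskit and qiskit-aer packages are installed) only encodes a toy basis state, applies the QFT and its inverse, and checks that the state comes back; mapping real ECG amplitudes onto qubits is the hard part that the chat never reached.

from qiskit import QuantumCircuit, transpile
from qiskit.circuit.library import QFT
from qiskit_aer import AerSimulator

n = 4  # toy size; real ECG windows would need far more qubits

# Encode a basis state, apply the QFT (no final swaps), then undo it and measure
qft = QFT(n, do_swaps=False)
circuit = QuantumCircuit(n)
circuit.x(0)
circuit.append(qft.to_gate(), range(n))
circuit.append(qft.inverse().to_gate(), range(n))
circuit.measure_all()

backend = AerSimulator()
counts = backend.run(transpile(circuit, backend), shots=1024).result().get_counts()
print(counts)  # expect every shot to return the encoded basis state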

Categories
Quantum Engineering

Cardiac Anomaly Detection using Azure Quantum Workspace

We begin with a simple goal:
How to speed up cardiac anomaly detection?

Current technology uses classical computing to apply Fast Fourier Transforms (FFT) to wavelets or spectrograms using input signals from ECG sensors. But classical computing takes a long time to calculate the output, which so far limits this approach to research on ECG anomaly detection.
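
For concreteness, that classical baseline can be sketched in a few lines of Python: load one channel of an MIT-BIH record with wfdb and compute a short-time Fourier spectrogram with scipy. The record name and window parameters below are illustrative, not the settings of a production pipeline.

import wfdb
from scipy import signal

# Stream one channel of MIT-BIH record 100 directly from PhysioNet
record = wfdb.rdrecord('100', pn_dir='mitdb')
ecg_channel = record.p_signal[:, 0]

# Classical approach: short-time Fourier transform -> time-frequency spectrogram
freqs, times, spectrogram = signal.spectrogram(
    ecg_channel, fs=record.fs, nperseg=256, noverlap=128
)
print(spectrogram.shape)  # (frequency bins, time windows)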

What if there is a way to use Quantum Computing to apply the Quantum Fourier Transform algorithm to the same problem? If we can pull this off, we could dramatically shift the speed and methodology of cardiac anomaly detection and treatment!

To understand how to apply Quantum Computing to solve this problem, there are several engineering steps with a plethora of decisions to be made in order to execute a solution:

We look at existing data sets on Electrocardiogram (ECG) that are publicly available for research.

We look at current methods to detect ECG anomalies.

We form a hypothesis and design experiments to prove or disprove it.

We select the right tools to implement and run the experiments and analyze their results.

We select the platform and develop the software to conduct the experiments on the platform of choice.

We pre-process the data set to get it ready to supply the platform running the software in the required form.

We design and stand up the system architecture on the platform of choice and conduct “stub runs” using the software to develop and debug it.

We run the full dataset on the software as a simulation to learn how to post-process the results in a concise presentable (and interpretable) form.

We also measure usage patterns to project the resolution we can afford, optimizing for usable results within the budget allocated for the platform and the project.

We finally run it on the platform using the real device with the same software.

We post-process the real-device results and compare them with the simulated results to obtain the delta between the two and determine whether the chosen platform can be extended to real-life scenarios.

Each of these steps requires active decisions and agility, which a tightly integrated platform must support: developing the code, versioning it, deploying it to the platform for execution, and capturing results in a resilient manner.

Using the above points, let’s examine our journey as Pivotport, Inc. embarked on this experiment to get to a real-world scenario.

In August 2021, Pivotport, Inc. applied to the Microsoft Azure Quantum Credits program with the intent to conduct the above experiment and obtain results on the IonQ provider via a new Pivotport Azure Quantum Workspace. Since then, we have received three approvals for $10,000 in Azure Quantum Credits each, for the IonQ, Quantinuum, and Rigetti providers respectively.

In September 2021, we embarked on the project. Using the Azure Quantum Workspace instructions, we built the Pivotport Azure Quantum Workspace as well as an Azure DevOps environment with a Repository for this project.

We chose Visual Studio Enterprise as well as Visual Studio Code to develop our Python codebase via the Azure DevOps Repository. Installing Python and viewing Jupyter Notebooks is much easier in VS Enterprise, as it’s fairly easy to keep the Python packages updated in the Python environment, but running the Jupyter Notebooks with our Python code proved possible in VS Code. This link shows how to use your IDE to submit jobs to Azure Quantum. If you prefer to run Jupyter Notebooks directly in your Azure Quantum Workspace, here is how to do so.
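
As a rough sketch of what job submission from the IDE looks like (the workspace resource ID, location, and target name below are placeholders, and the qiskit flavor of the azure-quantum package is assumed):

from azure.quantum.qiskit import AzureQuantumProvider
from qiskit import QuantumCircuit

# Connect to the Azure Quantum Workspace (fill in your own resource ID and region)
provider = AzureQuantumProvider(
    resource_id="/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Quantum/Workspaces/<workspace>",
    location="eastus",
)

# Pick the IonQ simulator target exposed through the workspace
backend = provider.get_backend("ionq.simulator")

# A toy three-qubit circuit, just to verify the submission path end to end
circuit = QuantumCircuit(3, 3)
circuit.h(0)
circuit.cx(0, 1)
circuit.cx(1, 2)
circuit.measure(range(3), range(3))

job = backend.run(circuit, shots=100)
print(job.job_id(), job.result().get_counts())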

We installed Python with all necessary modules in Visual Studio Enterprise on our dev machines to develop Python-based Jupyter Notebooks for the Quantum Fourier Transform algorithm, applied to ECG records to detect anomalies. These included WFDB, NumPy, Matplotlib, SciPy, and Azure Quantum with the IonQ provider.
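
A few lines are enough to pull a record with its beat annotations and eyeball it before any quantum processing (the record name and segment length below are just for illustration):

import matplotlib.pyplot as plt
import wfdb

# Stream a short segment of MIT-BIH record 100 and its beat annotations from PhysioNet
record = wfdb.rdrecord('100', pn_dir='mitdb', sampto=3000)
annotation = wfdb.rdann('100', 'atr', pn_dir='mitdb', sampto=3000)

# Plot the first channel and mark the annotated beats as a quick sanity check
plt.plot(record.p_signal[:, 0], label=record.sig_name[0])
plt.scatter(annotation.sample, record.p_signal[annotation.sample, 0],
            color='red', marker='x', label='annotated beats')
plt.xlabel('Sample index')
plt.legend()
plt.show()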

To visualize a Quantum Fourier Transform Circuit in action, you can use an online simulator such as this one.

We also precalculated gate-shot estimates in Excel, before the Azure Quantum Resource Estimator became available, to ensure the qubit count and gate count in our quantum circuit gave the right resolution while fitting within our budgeted Azure Quantum Credits.
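
A simplified Python version of that kind of estimate is sketched below. The gate counts follow the textbook QFT construction, and the per-gate-shot rates are placeholders only; substitute the provider’s current price sheet before trusting any number it prints.

# Back-of-the-envelope gate-shot budget for a QFT circuit.
# All prices are placeholders -- check the provider's current rates.
n_qubits = 10
shots = 1024

# A textbook QFT on n qubits uses n Hadamards and n*(n-1)/2 controlled-phase
# gates, before any final swaps or transpilation overhead on real hardware.
one_qubit_gates = n_qubits
two_qubit_gates = n_qubits * (n_qubits - 1) // 2

price_per_1q_gate_shot = 0.00003  # placeholder USD per single-qubit gate-shot
price_per_2q_gate_shot = 0.0003   # placeholder USD per two-qubit gate-shot

cost = shots * (one_qubit_gates * price_per_1q_gate_shot
                + two_qubit_gates * price_per_2q_gate_shot)
print(f"Estimated cost for one run: ${cost:,.2f}")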

We then proceeded to try out our Jupyter Notebook using the Pivotport Azure Quantum Workspace connection declared within it, integrated with Azure Active Directory user IDs and MFA for secure execution via the IonQ provider on the IonQ Simulator as well as the IonQ Harmony and Aria platforms. This took several months of debugging, and we finally succeeded in executing tachycardia and ventricular ectopy records on the IonQ simulator, as described in this blog.

We have successfully demonstrated that our code detects ECG anomalies in a single standardized Quantum Cardiac Spectrogram of an ECG of any given record length, provided we have sufficient Azure Quantum Credits or subscription allocation to support the required IonQ gate-shot estimates.

We are currently working towards executing single records on the IonQ Harmony and Aria, Quantinuum Simulator, H1 and H2, and Rigetti M1, M2, and M3 platforms to demonstrate that results are comparable at the same resolution, circuit depths, and gate-shot counts.

We are looking for further support for our research on Quantum Cardiac Spectrograms using QFT-driven Quantum Computing tied to a hybrid Azure IoT solution for ECG monitoring, a real-life hybrid cloud-quantum service that we intend to bring to market as a global cardiac anomaly detection and identification solution.

We thank Microsoft for having supported our egalitarian project with over $50,000 in Azure Quantum Credits for our work, and hope to do more next year with this support!

Happy Holidays to all! This blog is one of the entries for 12/22/2022 on https://devblogs.microsoft.com/qsharp/q-holiday-calendar-2022/

PS:

Please consider donating to our research by visiting this link. It will help us tremendously, as we have been running in bootstrap mode and are in need of funding support!

The Pivotport Quantum Engineering Team:

Jonathan Ortega: Quantum Development Intern

Rajiv Mistry: CEO, Pivotport, Inc.

Categories
Quantum Engineering

Executing Microsoft Quantum Topological Dataset Notebooks

Microsoft recently released its Topological Quantum Computing dataset for anyone to try out. This post describes how to do so.

As shown in this article, you can create an Azure Quantum Workspace and within its Notebooks tab, visit the available notebooks under Topological Quantum Computing.

There are three notebooks available to review:

  • Analysis of device data from preprint paper
  • First stage of topological gap protocol
  • Second stage of topological gap protocol

To view and run a notebook:

  1. In your new workspace, select Notebooks and then select Topological quantum computing.
  2. Select your desired notebook, and select Copy to my notebooks.

Once you open your Azure Quantum Workspace, select Topological Quantum Computing and copy each notebook into My notebooks, the third item among your listed Jupyter Notebooks.

Opening each notebook starts a new Jupyter Server instance within your Azure Quantum Workspace. Study it, then click Run all.

Each notebook should execute within a few minutes and populate the cells with results – data or graphics.

In case you want to see the outputs, you can read this paper, which contains all the graphics generated by these three notebooks. Below is a zip file containing execution results from the Pivotport Quantum Workspace, if you wish to examine them in your own Jupyter Server.

Categories
Quantum Engineering

Chapter in “Quantum Computer Music” published by Springer

Rajiv Mistry and Jonathan Ortega of Pivotport, Inc. were invited by Eduardo Miranda, editor of the book “Quantum Computer Music”, to author a chapter titled “Experiments in Quantum Frequency Detection using the Quantum Fourier Transform” after they presented the topic at the first International Symposium for Quantum Computing in Musical Creativity, hosted by the University of Plymouth in late 2021.

The book is published by Springer and available!

Categories
Quantum Engineering

Elevator Pitch: Pivotport Inc.

Excerpts from the “Quantum Computing in Healthcare and Life Sciences” Webinar by Nardo Manaloto

What is Quantum Engineering, Computing, Detection, Sensing & Noise?
Explanation of Quantum Engineering, Computing, Detection, Sensing, Noise by Rajiv Mistry @ 1:47:15

Pivotport Elevator Pitch
Pivotport Elevator Pitch by Rajiv Mistry @1:50:59

Our startup is developing a cutting-edge technology for early detection and identification of cardiac anomalies using quantum computing. By harnessing the power of quantum computing, our technology can analyze vast amounts of electrocardiogram data and generate cardiac spectrograms, providing highly accurate and actionable results that support quicker cardiac anomaly diagnosis.

This technology has the potential to revolutionize the way cardiac anomalies are detected and treated, leading to better outcomes for patients and significant cost savings for the healthcare system. With your investment, we will be able to bring this innovative technology to market and make a real impact on people’s lives.

Additionally, we aim to expand our technology to other medical applications, such as brain anomaly detection.

We are seeking funding to support the development and commercialization of our technology. With your support, we can bring Quantum Cardiac Anomaly Detection to the market and change the way cardiac anomalies are detected and treated.

Thank you for considering our proposal. We would be happy to discuss this opportunity further with you and answer any questions you may have. Please connect by booking a Teams call via the Contact page.

Categories
Quantum Engineering

Pivotport granted $10K in Azure Quantum Credits for Rigetti Provider

Microsoft granted Pivotport, Inc. $10K in Azure Quantum Credits for the Rigetti provider to continue development of the Quantum Cardiac Detector and Identifier project through simulator and QPU use.

Categories
Quantum Engineering

Pivotport granted $10K in Azure Quantum Credits for IonQ Aria

Microsoft Azure Quantum Group has granted use of $10K worth of Azure Quantum Credits for Aria, the IonQ Quantum Computer available on Azure as of today.

This is in addition to the prior grant of a similar amount of credits for Harmony by IonQ during September 2021.

We will continue to pilot the Quantum Cardiac Detector software using these credits towards QPU driven processing of ECG signals.

Categories
Quantum Engineering

Pivotport granted $10K in Azure Quantum Credits for Quantinuum Provider

Microsoft granted Pivotport, Inc. $10K in Azure Quantum Credits for the Quantinuum provider to continue development of the Quantum Cardiac Detector and Identifier project through simulator and QPU use.

Updates on this work will be posted here in the future.

Categories
Quantum Engineering

Pivotport, Inc. joins QED-C

Pivotport, Inc. has joined The Quantum Economic Development Consortium. We look forward to engaging with the Technology Adoption Committees and working with industry and academia to help advance our Quantum workforce through mentoring and internships in advanced Quantum Engineering engagements.

More information on QED-C is found here.