How to recover data (table level) using MySQL binary log files (mysqlbinlog)

Hello All,

Today we faced a very critical issue: we lost one table on the production server, and the bad part is that, due to near-realtime live replication, our secondary server was corrupted as well. To make things worse, our snapshots were not recent enough to restore the database to the state we expected. The one good thing was that, since we had live replication enabled using the MySQL binary log strategy, I had the full set of binary log files in production; however, it is not so easy to use binary log files directly to restore a specific table. (Note: what the MySQL binary log is: http://dev.mysql.com/doc/refman/5.0/en/binary-log.html)

Steps to solve this issue:

1. Copy MySQL Binary log files to another instance

2. Decode MySQL Binary log files to readable & executable statements

3. Grep / Awk to filter the statements of one specific table and append them to one single SQL file

4. Write a program (in Java or any language) to parse the entire SQL, find the point from which you need the data, and execute only those statements

NOTE: BEFORE WE START, KINDLY TAKE A SNAPSHOT OF PRODUCTION AS A GOOD PRACTICE.

1. Copy MySQL Binary log files to another instance

Locate the binary log files by carefully checking the /etc/my.cnf file (my.ini on Windows), and then copy the files to another server. This is a very crucial step: the binary log files have database and schema information embedded in the decoded SQL statements, and if you run a query against production by accident, it could be a catastrophic event for you. Be very careful and first transfer the files to some other instance.
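As a rough sketch (the paths, user and host names here are assumptions; adjust them to your setup), locating and copying the files might look like:

# grep log-bin /etc/my.cnf
# scp /var/lib/mysql/mysql-bin.* user@recovery-host:/data/binlog-recovery/

The first command shows the base name the binary logs are written under (for example log-bin=/var/lib/mysql/mysql-bin), and the second copies every numbered log file to a separate recovery instance.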

2. Decode MySQL binary log files to readable & executable statements

Before executing the MySQL binary logs with the ‘mysqlbinlog’ command, kindly refer to this page http://dev.mysql.com/doc/refman/5.0/en/mysqlbinlog.html and use the appropriate arguments. In my case, I needed to filter the data from the beginning until June 29th 2014 10:00 PM, and hence my commands were

>mysqlbinlog --stop-datetime="2014-06-29 22:00:00" mysql-bin.000001 --base64-output=decode-rows --verbose > decoded_after.sql
>mysqlbinlog --stop-datetime="2014-06-29 22:00:00" mysql-bin.000002 --base64-output=decode-rows --verbose >> decoded_after.sql
.....
>mysqlbinlog --stop-datetime="2014-06-29 22:00:00" mysql-bin.0000n --base64-output=decode-rows --verbose >> decoded_after.sql (where n is the last file number)

Once all this is done, keep a copy of this decoded_after.sql file and work on the copy from here on (this is just in case a later step corrupts the file, so that we can continue from this step).
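A simple way to do that (the backup file name is arbitrary):

# cp decoded_after.sql decoded_after.sql.bak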

3. Grep / Awk to filter the statements of one specific table and append them to one single SQL file

Now, this is the very important and efficient step, where we filter the INSERT, UPDATE & DELETE statements of one specific table.

# grep -e 'your_table_name' -B 6 decoded_after.sql > filtered.sql

Here 'your_table_name' is a placeholder for the table you are recovering; it is very important to use the right keyword to filter out the queries of that specific table. The -B 6 option also keeps the six lines before each match, so the surrounding context (timestamps, session SET statements) stays with every matched query. This runs at blazing speed compared to any other method. Do this to get the decoded SQL statements that correspond to one specific table / criteria.
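A slightly stricter variant (again with a hypothetical table name) that matches only the three statement types we care about could be:

# grep -E -B 6 '(INSERT INTO|UPDATE|DELETE FROM)[^;]*your_table_name' decoded_after.sql > filtered.sql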

4. Write a program (in Java or any language) to parse the entire SQL, find the point from which you need the data, and execute only those statements

In our case, we wrote a Java program that filtered all the data for our specific case. We had run a query that nullified a very important identifier, and to get it back we had to look up whatever ID was present in each specific row before the DELETE statement was executed, taking it either from the original INSERT or from a later UPDATE. This way we were able to recover the data precisely; we restored it safely on a temporary instance first and, after several sanity checks, did the same in production.
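A minimal sketch of such a parser, assuming hypothetical column names (important_id, id), the filtered.sql file from the previous step, and a simple statement-per-line format (real decoded output, especially the ### pseudo-SQL that --base64-output=decode-rows produces for row events, will need its own patterns):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Scans the filtered binlog SQL and remembers the last value a column had
// before each row was deleted, so the lost identifiers can be rebuilt.
public class BinlogRecovery {

    // Hypothetical shape: "UPDATE your_table_name SET important_id=123 WHERE id=45"
    private static final Pattern WRITE = Pattern.compile(
            "(?:INSERT INTO|UPDATE)\\s+your_table_name.*?important_id\\s*=\\s*(\\d+).*?id\\s*=\\s*(\\d+)",
            Pattern.CASE_INSENSITIVE);

    // Hypothetical shape: "DELETE FROM your_table_name WHERE id=45"
    private static final Pattern DELETE = Pattern.compile(
            "DELETE FROM\\s+your_table_name.*?id\\s*=\\s*(\\d+)",
            Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) throws IOException {
        Map<String, String> lastSeen = new HashMap<>(); // row id -> last important_id
        List<String> lines = Files.readAllLines(Paths.get("filtered.sql"));

        for (String line : lines) {
            Matcher w = WRITE.matcher(line);
            if (w.find()) {
                lastSeen.put(w.group(2), w.group(1)); // remember the latest value per row
                continue;
            }
            Matcher d = DELETE.matcher(line);
            if (d.find()) {
                String rowId = d.group(1);
                // Emit a statement restoring the value the row had before the delete
                System.out.printf("UPDATE your_table_name SET important_id=%s WHERE id=%s;%n",
                        lastSeen.getOrDefault(rowId, "NULL"), rowId);
            }
        }
    }
}

The generated statements should of course be reviewed by hand before running anything against a real instance.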

This saved our day, and we got out of this trouble with a clean solution. I wanted to share it here because I didn’t find any clear steps for recovering a specific table using the mysqlbinlog command. I will be more than glad to help anyone facing the same issue.

Thanks,

Kousik Rajendran.


AAYUWIZ with Remote Patient Monitoring + Smart Rx + Labs + Referral Module

AAYUWIZ is a product of Sphata Systems. We are shaping AAYUWIZ with added features so that it truly becomes one single platform providing the entire range of solutions for any healthcare unit: 1. Private Providers, 2. Providers, 3. Labs, 4. Pharmacy, 5. Clinics, 6. Hospitals and more.

Here is the list of features we have added to Aayuwiz this year.

1. Remote Patient Monitoring

Remote Patient Monitoring is the module where we are trying to address the problem of continued care after a diagnosis / visit. Most providers today want to monitor the patient continuously after care. For that, we are devising our own embedded hardware that displays over HDMI and collects data from various biometric sensors. Initially we are addressing sensors like

  1. Glucometer
  2. Pulse Meter
  3. Temperature
  4. Blood Pressure
  5. ECG
  6. EMG

and more. The device connects to a TV and an IR module, and collects data from the patient at home, in home care, in a nursing home, or any place of travel. The device is also intended to have a simple, TV-channel-like navigation, and we plan to deliver it as something like an Apple TV or a similar micro-computer device.

The collected data will then be pushed over GSM / 3G / the Internet to the AAYUWIZ server. The server processes the data, applies all the rules, and intelligently reminds the patient / doctors if there is any deviation from the norms. If there is a severe abnormality beyond the usual range of values, the app immediately schedules an appointment with the doctor and lets the doctor and patient have a video call using the video-calling facility in AAYUWIZ.

This way, we collect data from patients at a very affordable price for the device (pricing yet to be announced) and bridge the gap in continued care for a patient.

2. Smart Rx – Smart Medication Prescription Module

We call it Smart Rx because the Rx prescription module of AAYUWIZ has a few important features, as follows:

  1. Brand / Generic Medicine search & prescribe
  2. Preset of default Frequency & Duration for each medicine
  3. Detailed Drug information with Pricing
  4. Drug Allergy notification

As listed, the prescription module has been extended with generic-medicine mapping along with dosage values, which allows medicines to be searched either by brand name or by generic name. Each medicine can carry default options for frequency (say, 1-0-1 a day) and duration (say, 2 days), which saves doctors time spent on the system. Also, when searching for a medicine, its detailed information is shown clearly in a pane, along with indications & contraindications, to help the doctor prescribe the right medicine. The other important feature is the Drug Allergy Notification: if a medicine has been marked as allergic (in AAYUWIZ), the notifier alerts the doctor if it is prescribed again.

3. Labs

Labs is another module that has been redefined with lots of features. One important thing I would like to highlight today is networked diagnostic centers. After selecting the lab tests, the doctor can also select a diagnostic center: a favorite one, an internally associated lab (within the clinic or hospital), or even an external diagnostic center. The lab user then logs in to AAYUWIZ, is notified, clicks directly on the message, and enters the lab data. The data is added to the patient record, and the doctor is notified in turn. That way, the whole lab system is networked.

4. Referral Module

Today we see that doctors are well connected: they refer their patients to other doctors in their network based on specialty or criticality of care. In AAYUWIZ, we have made this a simple workflow where the doctor searches for another doctor in his network, in the hospital, or by area, and then refers the patient. Once referred, the other doctor gets access to the patient's record along with the care note given by the referrer.

5. Search & Schedule

We have an external application as well as an internal module where a patient can go online, search for doctors by specialty, location and so on, and then schedule an appointment. Once done, the doctor gets a notification and can confirm the appointment. That app is also integrated with the scheduler module of AAYUWIZ.


Ultimately, what we are trying to address are some of the common issues we see in connecting various solutions together under one platform. That is AAYUWIZ! Expect more posts in the near future with updated and added modules.

Loading Maven dependencies for OSGi Bundles – Apache Solr / Lucene in Apache Karaf

Hi everyone,
Today was a great learning experience for me and my team, trying to build Apache Solr within Apache Karaf. I am going to talk about the missing-dependencies issue in Karaf when using the Felix bundle plugin in Maven. This is quite common, and I took Solr as the example here to explain it to you all.

What is Apache Karaf?
Apache Karaf is a small OSGi based runtime which provides a lightweight container onto which various components and applications can be deployed.

What is Apache Solr?
Solr is the popular, blazing fast open source enterprise search platform from the Apache Lucene™ project. Its major features include powerful full-text search, hit highlighting, faceted search, near real-time indexing, dynamic clustering, database integration, rich document (e.g., Word, PDF) handling, and geospatial search.

Creating a Client Project using SolrJ
SolrJ is a Java client to access Solr. It offers a Java interface to add, update, and query the Solr index. To add this feature to my Karaf container, we created a pom.xml with the minimal dependencies of:

Problem
<!-- SOLR CORE AND ITS DEPENDENCIES -->
<dependency>
  <groupId>org.apache.solr</groupId>
  <artifactId>solr-core</artifactId>
  <version>4.3.1</version>
</dependency>
<dependency>
  <groupId>org.apache.solr</groupId>
  <artifactId>solr-solrj</artifactId>
  <version>4.3.1</version>
</dependency>

Then, when deploying in Karaf, it showed some errors, which you can see when you read the bundle headers:
karaf@root> headers <START_ID>
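For context, a typical round trip in a Karaf 2.x shell might look like this (the Maven coordinates are placeholders for your own bundle):

karaf@root> osgi:install -s mvn:com.package/artifact-impl/0.5
karaf@root> osgi:headers <START_ID>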

Reason
The headers output shows some packages highlighted in red, which means they failed to resolve when we deployed the bundle built from our Maven pom.xml. At first we tried including all the dependencies that were missing, but it was like a fission effect: the list of missing dependencies kept growing more and more. There is, however, a simple approach that eliminates the missing-dependency issue. Here is a sample of the Felix plugin definition in our pom.xml:

Solution
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <version>2.3.7</version>
      <extensions>true</extensions>
      <executions>
        <execution>
          <id>bundle</id>
          <phase>package</phase>
          <goals>
            <goal>bundle</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <instructions>
          <Bundle-Name>BUNDLE NAME</Bundle-Name>
          <Bundle-Description>This bundle will be used as a common registry
            service across the platform</Bundle-Description>
          <Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
          <Import-Package>${bundle.import.package}</Import-Package>
          <Export-Package>${bundle.export.package}</Export-Package>
          <Embed-Transitive>true</Embed-Transitive>
          <Embed-Dependency>*;scope=compile|runtime;</Embed-Dependency>
          <DynamicImport-Package>${bundle.dynamicImport.package}</DynamicImport-Package>
        </instructions>
      </configuration>
    </plugin>
  </plugins>
</build>

Here the properties are defined, in the real project, as:


<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <bundle.import.package>com.**.**,*</bundle.import.package>
  <bundle.private.package>com.**.**</bundle.private.package>
  <bundle.export.package></bundle.export.package>
  <bundle.dynamicImport.package>
    org.apache.log,
    all.missing.dependencies
  </bundle.dynamicImport.package>
</properties>

Sample
Now we need to replace all.missing.dependencies with all the packages that are listed in red in Karaf. In my case, for Apache Solr, I added the list given below:

<bundle.dynamicImport.package>
  com.sun.jdmk.comm,
  com.sun.management,
  com.sun.msv.grammar,
  com.sun.msv.grammar.trex,
  com.sun.msv.reader,
  com.sun.msv.reader.trex.ng,
  com.sun.msv.reader.util,
  com.sun.msv.verifier,
  com.sun.msv.verifier.regexp,
  com.sun.msv.util,
  com.vividsolutions.jts.algorithm,
  com.vividsolutions.jts.geom,
  com.vividsolutions.jts.io,
  com.vividsolutions.jts.operation.union,
  com.vividsolutions.jts.operation.valid,
  com.vividsolutions.jts.simplify,
  com.vividsolutions.jts.util,
  javax.jms,
  javax.portlet,
  org.apache.log,
  org.jboss.netty.bootstrap,
  org.jboss.netty.buffer,
  org.jboss.netty.channel,
  org.jboss.netty.channel.group,
  org.jboss.netty.channel.socket.nio,
  org.relaxng.datatype,
  sun.misc
</bundle.dynamicImport.package>

Reasoning
The magic here is that we added all the missing / unavailable dependencies as dynamic imports for the Felix builder, which writes them into the DynamicImport-Package section of the jar's manifest. This tells the Apache Karaf container that those packages need not be part of the bundle; they are only wired in if and when a class from them is actually loaded. This way, you can eliminate all the missing-dependency issues for classes that do not need to be bundled along with the core jars.
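For illustration, the generated META-INF/MANIFEST.MF should then carry a header along these lines (abbreviated; the package list mirrors whatever you put in the property):

DynamicImport-Package: com.sun.jdmk.comm,com.sun.management,...,sun.misc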

Here is my full POM file:
<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.kousik.package</groupId>
    <artifactId>yourartifact</artifactId>
    <version>0.5</version>
  </parent>
  <groupId>com.package</groupId>
  <artifactId>artifact-impl</artifactId>
  <version>0.5</version>
  <name>artifact-impl</name>
  <url>http://maven.apache.org</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <bundle.import.package>com.package.api,*</bundle.import.package>
    <bundle.private.package>com.package.impl.ClassImpl</bundle.private.package>
    <bundle.export.package></bundle.export.package>
    <bundle.dynamicImport.package>
      com.sun.jdmk.comm,
      com.sun.management,
      com.sun.msv.grammar,
      com.sun.msv.grammar.trex,
      com.sun.msv.reader,
      com.sun.msv.reader.trex.ng,
      com.sun.msv.reader.util,
      com.sun.msv.verifier,
      com.sun.msv.verifier.regexp,
      com.sun.msv.util,
      com.vividsolutions.jts.algorithm,
      com.vividsolutions.jts.geom,
      com.vividsolutions.jts.io,
      com.vividsolutions.jts.operation.union,
      com.vividsolutions.jts.operation.valid,
      com.vividsolutions.jts.simplify,
      com.vividsolutions.jts.util,
      javax.jms,
      javax.portlet,
      org.apache.log,
      org.jboss.netty.bootstrap,
      org.jboss.netty.buffer,
      org.jboss.netty.channel,
      org.jboss.netty.channel.group,
      org.jboss.netty.channel.socket.nio,
      org.relaxng.datatype,
      sun.misc
    </bundle.dynamicImport.package>
  </properties>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>com.package.api</groupId>
      <artifactId>search-api</artifactId>
      <version>0.5</version>
    </dependency>
    <!-- SOLR CORE AND ITS DEPENDENCIES -->
    <dependency>
      <groupId>org.apache.solr</groupId>
      <artifactId>solr-core</artifactId>
      <version>4.3.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.solr</groupId>
      <artifactId>solr-solrj</artifactId>
      <version>4.3.1</version>
    </dependency>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.1.1</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <version>1.5.6</version>
    </dependency>
    <dependency>
      <groupId>org.apache.lucene</groupId>
      <artifactId>lucene-sandbox</artifactId>
      <version>4.0.0</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.felix</groupId>
        <artifactId>maven-bundle-plugin</artifactId>
        <version>2.3.7</version>
        <extensions>true</extensions>
        <executions>
          <execution>
            <id>bundle</id>
            <phase>package</phase>
            <goals>
              <goal>bundle</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <instructions>
            <Bundle-Name>Project Impl</Bundle-Name>
            <Bundle-Description>This bundle will be used as a common registry
              service across the platform</Bundle-Description>
            <Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
            <Import-Package>${bundle.import.package}</Import-Package>
            <Export-Package>${bundle.export.package}</Export-Package>
            <Embed-Transitive>true</Embed-Transitive>
            <Embed-Dependency>*;scope=compile|runtime;</Embed-Dependency>
            <DynamicImport-Package>${bundle.dynamicImport.package}</DynamicImport-Package>
          </instructions>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <repositories>
  </repositories>
</project>

You can also download this pom.xml here.
Now that we know the technique, we have started using the same approach for other services too.

Thanks,
Kousik Rajendran.