Watermark removal with Python and OpenCV

This mainly uses OpenCV's inpaint method to achieve a mask-style cleanup.



String to array:

import cv2
import numpy as np

def get_opencv_img_from_string(data):
    # Decode raw image bytes into an OpenCV image array.
    arr = np.array(bytearray(data), dtype=np.uint8)
    img = cv2.imdecode(arr, -1)  # -1: keep the original channels unchanged
    return img

Array to string:

def get_buffer_from_bytes_array(data):
    # Encode the image array as JPEG and return the raw bytes
    # (tobytes() replaces the deprecated tostring()).
    arr = cv2.imencode(".jpg", data)[1].tobytes()
    return arr

def watermark_clean(data, platform):
    img = get_opencv_img_from_string(data)
    height, width = img.shape[:2]
    # Watermark regions: bottom-right and top-left corners.
    rects = ((width - 169, height - 22, width, height), (1, 1, 164, 21))
    # Single-channel mask: non-zero pixels mark the areas to reconstruct.
    mask = np.zeros(img.shape[:2], np.uint8)
    for x1, y1, x2, y2 in rects:
        cv2.rectangle(mask, (x1, y1), (x2, y2), 255, -1)
    # Inpaint all masked regions in one pass.
    img = cv2.inpaint(img, mask, 10.0, cv2.INPAINT_TELEA)
    return get_buffer_from_bytes_array(img)

def show_img(img_arr):
    cv2.imshow('test', img_arr)
    cv2.waitKey(0)  # block until a key is pressed so the window stays visible
    cv2.destroyAllWindows()

Installing OpenCV 2.4.* with Python Support

Installing OpenCV can leave newcomers who just want to dip a toe into image processing tearing their hair out, so here is a record of the painful install process, to save others from falling into the same holes.


cd ~/
git clone
git clone


cd ~/opencv
mkdir build
cd build
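The configure/build commands themselves are missing from the notes; a typical baseline invocation from inside the build directory looks like the following (these flags are my assumption of a common default, not taken from the original notes):

```shell
# Run from ~/opencv/build; adjust the install prefix to taste.
cmake -D CMAKE_BUILD_TYPE=RELEASE \
      -D CMAKE_INSTALL_PREFIX=/usr/local ..
make -j4
```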



(If the ippicv_linux_20151201.tgz download fails at this step, you can download the file manually, place it under opencv/3rdparty/ippicv/downloads/linux_*/, and then run cmake again)




sudo make install


sudo ln -s /usr/local/lib/python2.7/site-packages/ {your_python_path}/

Android Studio + OpenCV

The steps below show how to use the Android OpenCV SDK in Android Studio. This is a simplified version of a Stack Overflow answer.

  1. Download the latest OpenCV SDK for Android and decompress the zip file.
  2. Import OpenCV into Android Studio: from File -> New -> Import Module, choose the sdk/java folder in the unzipped OpenCV archive.
  3. Update build.gradle under the imported OpenCV module so that 4 fields match your project's build.gradle: a) compileSdkVersion b) buildToolsVersion c) minSdkVersion and d) targetSdkVersion.
  4. Add a module dependency via Application -> Module Settings, and select the Dependencies tab. Click the + icon at the bottom, choose Module Dependency and select the imported OpenCV module.
    • For Android Studio v1.2.2, to access to Module Settings : in the project view, right-click the dependent module -> Open Module Settings
  5. Copy libs folder under sdk/native to Android Studio under app/src/main.
  6. In Android Studio, rename the copied libs directory to jniLibs and we are done.

Step (6) is needed because Android Studio expects native libs in app/src/main/jniLibs instead of the older libs folder. For those new to Android OpenCV, don't miss the steps below:

  • include static{ System.loadLibrary("opencv_java"); } or static{ OpenCVLoader.initDebug(); } (Note: for OpenCV version 3 at this step you should instead load the library opencv_java3.)
  • For step(5), if you ignore any platform libs like x86, make sure your device/emulator is not on that platform.

OpenCV is written in C/C++. The available Java wrappers are:

  1. Android OpenCV SDK – maintained Android Java wrapper. I suggest this one.
  2. OpenCV Java – maintained auto generated desktop Java wrapper.
  3. JavaCV – Popular Java wrapper maintained by independent developer(s). Not Android specific. This library might get out of sync with OpenCV newer versions.

Using Apache Kylin

Most problems when deploying Kylin or building cubes turn out to be environment issues or Hadoop/HBase version mismatches. The supported versions are:


  • Hadoop: 2.4 – 2.7
  • Hive: 0.13 – 0.14
  • HBase: 0.98 – 0.99
  • JDK: 1.7+

The slightly tricky part is that HBase 0.9x does not support Hadoop 2.7. If your Hadoop is 2.7.*, you need to deploy HBase 1.*, and for HBase 1.* you need the separately built Kylin binary package:

Binary Package (for running on HBase 1.1.3 or above)

Job fails when building a cube:

native snappy library not available: SnappyCompressor has not been loaded.

The cause is that the Hadoop native libs are missing the Snappy compression library:

sudo yum install snappy snappy-devel
sudo ln -s /usr/lib64/ $HADOOP_HOME/lib/native/

Then add the following in $HADOOP_HOME/etc/hadoop/:

export JAVA_LIBRARY_PATH="/usr/local/hadoop/lib/native"




2016-02-22 16:24:16,740 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatInputFormat not found
	at org.apache.hadoop.conf.Configuration.getClass(
	at org.apache.hadoop.mapreduce.task.JobContextImpl.getInputFormatClass(
	at org.apache.hadoop.mapred.MapTask.runNewMapper(
	at org.apache.hadoop.mapred.YarnChild$
	at Method)
	at org.apache.hadoop.mapred.YarnChild.main(
Caused by: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatInputFormat not found
	at org.apache.hadoop.conf.Configuration.getClassByName(
	at org.apache.hadoop.conf.Configuration.getClass(
	... 8 more


The issue here is that Kylin assumes the same Hive jars exist on all Hadoop nodes. When a node is missing the Hive jars (or has them in a different location), you get the ClassNotFoundException on HCatInputFormat.

By the way, you should be able to get a clear error message from the Yarn job console. This is a known issue.

Deploying Hive to all cluster nodes will certainly fix the problem, as you have tried.

Another (cleaner) workaround is to manually configure Kylin to submit the Hive jars as additional job dependencies.

Finally, there is also an open JIRA suggesting that Kylin should submit the Hive jars by default.


org.apache.kylin.job.exception.ExecuteException: org.apache.kylin.job.exception.ExecuteException: java.lang.NoSuchMethodError: org.apache.hadoop.yarn.conf.YarnConfiguration.getServiceAddressConfKeys(Lorg/apache/hadoop/conf/Configuration;)Ljava/util/List;
        at org.apache.kylin.job.execution.AbstractExecutable.execute(
        at org.apache.kylin.job.impl.threadpool.DefaultScheduler$
        at java.util.concurrent.ThreadPoolExecutor.runWorker(
        at java.util.concurrent.ThreadPoolExecutor$
Caused by: org.apache.kylin.job.exception.ExecuteException: java.lang.NoSuchMethodError: org.apache.hadoop.yarn.conf.YarnConfiguration.getServiceAddressConfKeys(Lorg/apache/hadoop/conf/Configuration;)Ljava/util/List;
        at org.apache.kylin.job.execution.AbstractExecutable.execute(
        at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(
        at org.apache.kylin.job.execution.AbstractExecutable.execute(
        ... 4 more
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.yarn.conf.YarnConfiguration.getServiceAddressConfKeys(Lorg/apache/hadoop/conf/Configuration;)Ljava/util/List;
        at org.apache.hadoop.yarn.conf.HAUtil.getConfKeyForRMInstance(
        at org.apache.hadoop.yarn.conf.HAUtil.getConfValueForRMInstance(
        at org.apache.hadoop.yarn.conf.HAUtil.getConfValueForRMInstance(
        at org.apache.kylin.job.common.MapReduceExecutable.getRestStatusCheckUrl(
        at org.apache.kylin.job.common.MapReduceExecutable.doWork(
        at org.apache.kylin.job.execution.AbstractExecutable.execute(
        ... 6 more

This is an HBase version mismatch; copy the newer hadoop-yarn-api-*.jar into hbase/lib.


Kylin error: org.apache.hadoop.hbase.TableNotFoundException: Table KYLIN_* is not currently available.
           Load HFile to HBase Table failed

Checking the HBase log showed this was again a Snappy problem. After switching Kylin's compression codec to gzip and restarting, everything worked and cubes could be built successfully.
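The codec switch is a kylin.properties change. To the best of my recollection the Kylin 1.x key was named as below (it was renamed in later versions), so verify against the kylin.properties template shipped with your version:

```properties
# $KYLIN_HOME/conf/kylin.properties
# Use gzip instead of snappy for HBase storage, avoiding the native snappy dependency.
kylin.hbase.default.compression.codec=gzip
```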

5. Error says kylin_metadata already exists so the table cannot be created in HBase, yet the table is not visible in HBase

Delete that table's node under HBase in ZooKeeper:


rmr /hbase-unsecure/table/kylin_metadata