jackson-databind-<version>.jar
jackson-core-<version>.jar
jackson-annotations-<version>.jar
re2j-<version>.jar
jaeger-core-<version>.jar
opentracing-api
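The jar list above can also be pulled in through Maven instead of being copied by hand. A hedged sketch of the matching POM fragment; the `groupId`/`artifactId` coordinates are the usual Maven Central ones for these libraries, and every `<version>` property is a placeholder you must pin to the versions shipped with your cluster client:

```xml
<dependencies>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>${jackson.version}</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>${jackson.version}</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>${jackson.version}</version>
  </dependency>
  <dependency>
    <groupId>com.google.re2j</groupId>
    <artifactId>re2j</artifactId>
    <version>${re2j.version}</version>
  </dependency>
  <dependency>
    <groupId>io.jaegertracing</groupId>
    <artifactId>jaeger-core</artifactId>
    <version>${jaeger.version}</version>
  </dependency>
  <dependency>
    <groupId>io.opentracing</groupId>
    <artifactId>opentracing-api</artifactId>
    <version>${opentracing.version}</version>
  </dependency>
</dependencies>
```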
d1.s1";
static final String ROOT_SG1_D1 = "root.sg1.d1";
public static void main(String[] args) throws Exception {
    // use session api
Start to access REST API. Warning: Permanently added '192.168.234.117' (ED25519) to the list of known hosts.
Java bin/flink run --class com.huawei.bigdata.flink.examples.FlinkProcessingTimeAPIMain /opt/client/FlinkCheckpointJavaExample.jar --chkPath
com.huawei.bigdata.spark.examples.SparkHivetoHbase --master yarn --deploy-mode client /opt/female/SparkHivetoHbase-1.0.jar Run the Python sample program. Because pyspark does not provide HBase-related APIs
com.huawei.bigdata.spark.examples.SparkHbasetoHbase --master yarn --deploy-mode client /opt/female/SparkHbasetoHbase-1.0.jar Run the Python sample program. Because pyspark does not provide HBase-related APIs
HttpChannelState@38be31e{s=IDLE rs=COMPLETED os=COMPLETED is=IDLE awp=false se=false i=false al=0},r=1,c=true/true,a=IDLE,uri=https://10.244.224.65:21495/api
mm:ss,SSS>|<Log Level>|<Name of the thread that generates the log>|<Message in the log>|<Location where the log event occurs> 2020-04-29 20:09:28,543 | INFO | http-bio-21401-exec-56 | Request comes from API
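Log lines in the pipe-delimited format described above can be split field by field with plain Java. A minimal sketch; the sample line is the one quoted in the text, and the field names simply follow the format description:

```java
public class LogLineParser {
    // Fields: <timestamp> | <log level> | <thread name> | <message> | <location>
    public static String[] parse(String line) {
        return line.split("\\|");
    }

    public static void main(String[] args) {
        String line = "2020-04-29 20:09:28,543 | INFO | http-bio-21401-exec-56 | Request comes from API";
        String[] fields = parse(line);
        // The second field is the log level.
        System.out.println("level=" + fields[1].trim());  // prints "level=INFO"
    }
}
```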
generate_keystore.sh" script does not configure the values of "security.ssl.key-password", "security.ssl.keystore-password", and "security.ssl.truststore-password"; you need to use the Manager plaintext encryption API
Security mode) curl --location-trusted -u Doris username:password -H "label:table1_20230217" -H "column_separator:," -T data.csv https://Doris FE instance IP address:HTTPS port/api
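The stream-load call above can be wrapped in a small script. A sketch that only assembles and prints the command rather than sending it; `FE_HOST`, `FE_HTTPS_PORT`, the user, and the `default_db/table1` database/table segment of the URL are placeholders for your environment (the original URL is truncated at `/api`):

```shell
#!/bin/sh
# Placeholders -- replace with your Doris FE address, HTTPS port, and user.
FE_HOST="192.0.2.10"
FE_HTTPS_PORT="8030"
DORIS_USER="doris_user"
# Each stream load needs a unique label so retries are deduplicated.
LABEL="table1_20230217"

# Assemble the curl command (not executed here).
CMD="curl --location-trusted -u ${DORIS_USER} \
 -H \"label:${LABEL}\" -H \"column_separator:,\" \
 -T data.csv https://${FE_HOST}:${FE_HTTPS_PORT}/api/default_db/table1/_stream_load"
echo "$CMD"
```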
to obtain it, run curl -k -i -u user name:password -X POST -HContent-type:application/json -d '{"plainText":"password"}' 'https://x.x.x.x:28443/web/api/
final boolean asyncEnable = false; Producer producerThread = new Producer(KafkaProperties.TOPIC, asyncEnable); } For the Kafka producer code, see Using the Producer API
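Before a producer thread like the one above can send records, the client is normally configured through `java.util.Properties`. A minimal stdlib-only sketch; the broker address is a placeholder, and the serializer class names are the standard Kafka string serializers (the real sample presumably reads these from its `KafkaProperties` helper):

```java
import java.util.Properties;

public class ProducerConfigSketch {
    public static Properties producerProps(String bootstrapServers) {
        Properties props = new Properties();
        // Comma-separated list of Kafka brokers (placeholder host:port).
        props.put("bootstrap.servers", bootstrapServers);
        // Standard Kafka serializers for String keys and values.
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        Properties p = producerProps("broker-1:21007");
        System.out.println(p.getProperty("bootstrap.servers"));  // prints "broker-1:21007"
    }
}
```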
INTO udfSink SELECT a, udaf(a) FROM udfSource group by a; UDTF Java code and SQL samples. UDTF Java usage sample: package com.xxx.udf; import org.apache.flink.api.java.tuple.Tuple2
hbase_sample_table',{FILTER=>"SingleColumnValueFilter(family,qualifier,compareOp,comparator,filterIfMissing,latestVersionOnly)"} For complex queries beyond what the hbase shell supports, use the API
<Name of the thread that generates the log>|<Message in the log>|<Location where the log event occurs> 2020-01-19 16:05:18,589 | INFO | regionserver16020-SendThread(linux-k6da:2181) | Client will use GSSAPI