Data Lake Insight (DLI) - How to Set the AK/SK for Operating OBS Tables on a General-Purpose Queue: Solution 2: Setting the Obtained AK/SK in a Spark Jar Job

Time: 2024-04-29 16:49:36

Solution 2: Setting the Obtained AK/SK in a Spark Jar Job

  • If the obtained credentials are an AK/SK pair, set them as follows (a minimal end-to-end sketch follows this list):
    • Creating a SparkContext in code
      val sc: SparkContext = new SparkContext()
      sc.hadoopConfiguration.set("fs.obs.access.key", ak)
      sc.hadoopConfiguration.set("fs.obs.secret.key", sk)
    • Creating a SparkSession in code
      val sparkSession: SparkSession = SparkSession
            .builder()
            .config("spark.hadoop.fs.obs.access.key", ak)
            .config("spark.hadoop.fs.obs.secret.key", sk)
            .enableHiveSupport()
            .getOrCreate()
  • If the obtained credentials are a temporary AK/SK pair and a security token (SecurityToken), the temporary AK/SK and the security token must be used together during authentication. Set them as follows (a second sketch for this case also follows the list):
    • Creating a SparkContext in code
      val sc: SparkContext = new SparkContext()
      sc.hadoopConfiguration.set("fs.obs.access.key", ak)
      sc.hadoopConfiguration.set("fs.obs.secret.key", sk)
      sc.hadoopConfiguration.set("fs.obs.session.token", sts)
    • Creating a SparkSession in code
      val sparkSession: SparkSession = SparkSession
            .builder()
            .config("spark.hadoop.fs.obs.access.key", ak)
            .config("spark.hadoop.fs.obs.secret.key", sk)
            .config("spark.hadoop.fs.obs.session.token", sts)
            .enableHiveSupport()
            .getOrCreate()
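For reference, the following is a minimal, self-contained sketch of the permanent AK/SK case: it builds a SparkSession with the spark.hadoop.fs.obs.* keys shown above and reads an object from an OBS path. The environment variable names (OBS_AK, OBS_SK) and the obs:// bucket and object path are illustrative assumptions, not fixed names.

  import org.apache.spark.sql.SparkSession

  object ObsAkSkExample {
    def main(args: Array[String]): Unit = {
      // Illustrative only: read the AK/SK from environment variables instead of hard-coding them.
      val ak = sys.env.getOrElse("OBS_AK", "")
      val sk = sys.env.getOrElse("OBS_SK", "")

      val sparkSession: SparkSession = SparkSession
        .builder()
        .config("spark.hadoop.fs.obs.access.key", ak)
        .config("spark.hadoop.fs.obs.secret.key", sk)
        .enableHiveSupport()
        .getOrCreate()

      // Read a CSV object from OBS; the bucket and path below are hypothetical.
      val df = sparkSession.read
        .option("header", "true")
        .csv("obs://your-bucket/path/to/data.csv")
      df.show()
    }
  }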
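Similarly, a minimal sketch for the temporary-credential case is shown below. It assumes the temporary AK/SK and the security token are passed in through environment variables (the variable names, database, and table names are hypothetical); as noted above, the three fs.obs.* settings must be supplied together.

  import org.apache.spark.sql.SparkSession

  object ObsTemporaryCredentialsExample {
    def main(args: Array[String]): Unit = {
      // Illustrative only: the temporary AK/SK and the security token come from
      // whatever credential-issuing step the job uses; the variable names are assumptions.
      val ak  = sys.env.getOrElse("OBS_TEMP_AK", "")
      val sk  = sys.env.getOrElse("OBS_TEMP_SK", "")
      val sts = sys.env.getOrElse("OBS_SECURITY_TOKEN", "")

      val sparkSession: SparkSession = SparkSession
        .builder()
        .config("spark.hadoop.fs.obs.access.key", ak)
        .config("spark.hadoop.fs.obs.secret.key", sk)
        // Temporary credentials are only valid when the session token is set as well.
        .config("spark.hadoop.fs.obs.session.token", sts)
        .enableHiveSupport()
        .getOrCreate()

      // Query an OBS-backed table; the database and table names are hypothetical.
      sparkSession.sql("SELECT * FROM demo_db.demo_obs_table LIMIT 10").show()
    }
  }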
support.huaweicloud.com/dli_faq/dli_03_0017.html