Using the Feature
- Add the following parameters to spark-defaults.conf, or to a custom properties file that spark-sql loads at startup, so they take effect when spark-sql runs (a quick verification sketch follows the parameter list).
  spark.sql.cbo.enabled true
  spark.sql.cbo.planStats.enabled true
  spark.sql.ndp.enabled true
  spark.sql.ndp.filter.selectivity.enable true
  spark.sql.ndp.filter.selectivity 0.5
  spark.sql.ndp.alive.omnidata 3
  spark.sql.ndp.table.size.threshold 10
  spark.sql.ndp.zookeeper.address agent1:2181,agent2:2181,agent3:2181
  spark.sql.ndp.zookeeper.path /sdi/status
  spark.sql.ndp.zookeeper.timeout 15000
  spark.driver.extraLibraryPath /home/omm/omnidata-install/haf-host/lib
  spark.executor.extraLibraryPath /home/omm/omnidata-install/haf-host/lib
  spark.executorEnv.HAF_CONFIG_PATH /home/omm/omnidata-install/haf-host/etc/
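Once the parameters are in place, you can confirm that a spark-sql session actually picks them up. A minimal verification sketch, assuming the properties were written to $SPARK_HOME/conf/spark-defaults.conf (if you used a custom file instead, pass it with --properties-file):

  # Print the effective values of a few pushdown-related properties from the CLI.
  $SPARK_HOME/bin/spark-sql -e "SET spark.sql.cbo.enabled; SET spark.sql.ndp.enabled; SET spark.sql.ndp.zookeeper.address;"

Each SET statement echoes the key together with its current value, so a missing or mistyped property is easy to spot.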
- Add the startup options --driver-class-path '/opt/boostkit/*' --jars '/opt/boostkit/*' --conf 'spark.executor.extraClassPath=./*' to the spark-sql command, as shown below.
  $SPARK_HOME/bin/spark-sql \
    --driver-class-path '/opt/boostkit/*' \
    --jars '/opt/boostkit/*' \
    --conf 'spark.executor.extraClassPath=./*' \
    --name tpch_query6.sql \
    --driver-memory 50G \
    --driver-java-options -Dlog4j.configuration=file:/usr/local/spark/conf/log4j.properties \
    --executor-memory 32G \
    --num-executors 30 \
    --executor-cores 18
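The command above launches the spark-sql CLI with the BoostKit jars on the driver and executor classpaths; to execute a query file non-interactively, the file can additionally be passed with -f. A minimal sketch, assuming a hypothetical query file /opt/sql/tpch_query6.sql:

  # Run a TPC-H query file through spark-sql with the pushdown jars on the classpath.
  # /opt/sql/tpch_query6.sql is a placeholder path; substitute your own query file.
  $SPARK_HOME/bin/spark-sql \
    --driver-class-path '/opt/boostkit/*' \
    --jars '/opt/boostkit/*' \
    --conf 'spark.executor.extraClassPath=./*' \
    -f /opt/sql/tpch_query6.sql

The resource options shown in the full command (--driver-memory, --executor-memory, --num-executors, --executor-cores) are omitted here for brevity and should be sized for your cluster.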
- For the configuration file changes and the detailed feature usage procedure, see the OmniRuntime Feature Guide.
Parent topic: OmniRuntime Operator Pushdown Feature