spark
Here are 5,022 public repositories matching this topic...
Is your feature request related to a problem? Please describe.
It would be great to be able to configure the output of the base route. For example, I want to hide the message "Cube.js server is running in production mode. Learn more about production mode." that is shown when the base route is hit.
Describe the solution you'd like
I think it would be great if one could configure the output of the base route in https://github.com/cube-js/cube.js/
Hi, if my Spark app uses two storage types, both S3 and Azure Data Lake Storage Gen2, could I put spark.delta.logStore.class=org.apache.spark.sql.delta.storage.AzureLogStore,org.apache.spark.sql.delta.storage.S3SingleDriverLogStore?
Thanks in advance.
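The single spark.delta.logStore.class key takes one class, so a comma-separated list is unlikely to work. Below is a minimal sketch of an alternative, assuming a Delta Lake release that supports per-scheme spark.delta.logStore.<scheme>.impl keys (the scheme-specific keys are an assumption about newer versions; the class names are taken from the question):

```python
from pyspark.sql import SparkSession

# Sketch only: route each storage scheme to its own LogStore, assuming the
# per-scheme spark.delta.logStore.<scheme>.impl configuration is available.
spark = (
    SparkSession.builder
    .appName("delta-multi-storage")
    # S3 paths (s3a://...) use the S3-specific LogStore
    .config("spark.delta.logStore.s3a.impl",
            "org.apache.spark.sql.delta.storage.S3SingleDriverLogStore")
    # ADLS Gen2 paths (abfss://...) use the Azure LogStore
    .config("spark.delta.logStore.abfss.impl",
            "org.apache.spark.sql.delta.storage.AzureLogStore")
    .getOrCreate()
)
```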
I have a simple regression task (using a LightGBMRegressor) where I want to penalize negative predictions more than positive ones. Is there a way to achieve this with the default LightGBM regression objectives (see https://lightgbm.readthedocs.io/en/latest/Parameters.html)? If not, is it somehow possible to define and pass a custom regression objective (there are many examples for the default LightGBM model)?
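For the plain lightgbm Python package, a custom objective is a callable that returns the gradient and Hessian; a minimal sketch of an asymmetric squared error is shown below (the neg_weight value and the use of LGBMRegressor are illustrative assumptions, and the Spark LightGBMRegressor referenced above may expose custom objectives differently, if at all):

```python
import numpy as np
import lightgbm as lgb

def asymmetric_l2(y_true, y_pred):
    """Squared error whose gradient/Hessian are scaled up for negative predictions.

    Returns (grad, hess) as required by LightGBM custom objectives.
    """
    neg_weight = 5.0  # illustrative assumption: extra penalty for negative predictions
    residual = y_pred - y_true
    weight = np.where(y_pred < 0, neg_weight, 1.0)
    grad = 2.0 * residual * weight
    hess = 2.0 * weight
    return grad, hess

# Usage with the scikit-learn wrapper: a callable objective is passed directly.
X = np.random.rand(200, 4)
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + np.random.normal(scale=0.1, size=200)
model = lgb.LGBMRegressor(objective=asymmetric_l2, n_estimators=50)
model.fit(X, y)
```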
Problem
Since Java 8 was introduced, there is no need to use Joda-Time, as it has been replaced by the native Date-Time API (java.time).
Solution
Ideally, grepping and replacing the text should work (mostly).
Additional context
Need to check whether de/serialization will still work.
At the moment the relu_layer op doesn't allow threshold configuration, while the legacy RELU op does.
We should add a threshold configuration option to relu_layer.
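For reference, one common definition of a thresholded ReLU passes values strictly above the threshold and zeroes the rest; a minimal NumPy sketch follows (the function name and semantics are illustrative, not the op's actual API):

```python
import numpy as np

def relu_with_threshold(x, threshold=0.0):
    """Thresholded ReLU: keep x where x > threshold, output 0 elsewhere.

    With threshold=0.0 this reduces to the standard ReLU.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > threshold, x, 0.0)

print(relu_with_threshold([-1.0, 0.5, 2.0], threshold=1.0))  # [0. 0. 2.]
```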