
Support clamp() with tensor min and max #2793

Closed
zuoxingdong opened this issue Sep 19, 2017 · 10 comments
Assignees
Labels
function request A request for a new function or the addition of new arguments/modes to an existing function. good first issue high priority module: numpy Related to numpy support, and also numpy compatibility of our operators triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@zuoxingdong
Copy link
Contributor

zuoxingdong commented Sep 19, 2017

It might be useful to extend the current clamp() function so that min and max can be tensors.

e.g., to clip the values of a [5, 3] tensor, we could define min and max for each element, or for each row/column.

Finally, it might be good to rename it to clip for consistency with numpy.clip

cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser @mruberry @rgommers @heitorschueroff

@soumith soumith added this to nn / autograd / torch in Issue Categories Sep 20, 2017
@soumith soumith moved this from nn / autograd / torch to torch /autograd in Issue Categories Sep 20, 2017
@nishnik
Copy link

nishnik commented Nov 14, 2017

Hey @zuoxingdong, how should this be approached?
I would like to contribute.

@GuillaumeLeclerc
Copy link

I think it would indeed be a good thing. I did not find a better way than torch.stack followed by a clamp.

@yaceben
Copy link

yaceben commented Oct 11, 2018

Any update on this?

In the meantime, I guess this is a possible alternative...

clipped = torch.max(torch.min(x, max), min)
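For reference, a runnable sketch of this one-liner (the variable names `lo` and `hi` are illustrative, not from the thread; both bounds broadcast elementwise against `x`):

```python
import torch

x = torch.tensor([[-2.0, 0.5, 3.0],
                  [ 1.0, -1.0, 2.5]])
lo = torch.full_like(x, -1.0)  # per-element lower bound
hi = torch.full_like(x, 2.0)   # per-element upper bound

# Take the smaller of (x, hi), then the larger of that and lo:
# each element now lies in [lo, hi].
clipped = torch.max(torch.min(x, hi), lo)
```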

@Chillee Chillee self-assigned this Jul 25, 2019
@izdeby izdeby added enhancement Not as big of a feature, but technically not a bug. Should be easy to fix triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module labels Jul 26, 2019
@adam-dziedzic
Copy link

@yaceben thanks for the one-liner

@dniku
Copy link

dniku commented Oct 24, 2019

One could also use torch.where, similar to the way suggested in this thread.
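A sketch of that torch.where variant (the helper name `clip_where` is illustrative; `lo` and `hi` may be scalars or tensors broadcastable against `x`):

```python
import torch

def clip_where(x, lo, hi):
    # Replace entries below lo, then entries above hi, leaving the rest intact.
    x = torch.where(x < lo, lo, x)
    return torch.where(x > hi, hi, x)

x = torch.tensor([-3.0, 0.0, 5.0])
result = clip_where(x, torch.tensor(-1.0), torch.tensor(4.0))
# result: tensor([-1., 0., 4.])
```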

@VitalyFedyunin VitalyFedyunin added good first issue module: bootcamp We plan to do a full writeup on the issue, and then get someone to do it for onboarding labels Dec 11, 2019
@devpouya
Copy link

Any progress here?

@arpanmukherjee
Copy link

I can take up this issue if no one else has.

@mruberry mruberry added function request A request for a new function or the addition of new arguments/modes to an existing function. and removed enhancement Not as big of a feature, but technically not a bug. Should be easy to fix module: bootcamp We plan to do a full writeup on the issue, and then get someone to do it for onboarding labels Jan 10, 2021
@mruberry mruberry added the module: numpy Related to numpy support, and also numpy compatibility of our operators label Jan 10, 2021
@mruberry
Copy link
Collaborator

I've updated the labels on this issue. We would still accept a PR implementing this behavior. Note that this functionality is consistent with NumPy, where np.clip can accept array-likes as bounds.
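For comparison, here is how NumPy's np.clip broadcasts array-like bounds (the values are made up for illustration):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]
lo = np.array([1, 1, 1])         # per-column lower bound
hi = np.array([[4], [4]])        # per-row upper bound

clipped = np.clip(a, lo, hi)     # [[1, 1, 2], [3, 4, 4]]
```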

@mruberry mruberry removed triage review triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module labels Feb 11, 2021
@mruberry
Copy link
Collaborator

Updating to high priority based on user activity.

@albanD albanD added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Feb 11, 2021
@ssgosh
Copy link

ssgosh commented Apr 14, 2021

This feature will be very useful for clipping multi-channel images with different ranges for each channel.
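As a hedged sketch of that use case (the channel ranges here are made up), per-channel bounds reshaped to broadcast over H and W can be applied with the min/max composition until clamp() itself accepts tensors:

```python
import torch

img = torch.randn(3, 4, 4)                           # C x H x W
lo = torch.tensor([-1.0, 0.0, -0.5]).view(3, 1, 1)   # per-channel minima
hi = torch.tensor([ 1.0, 2.0,  0.5]).view(3, 1, 1)   # per-channel maxima

# The (3, 1, 1) bounds broadcast across each 4x4 channel.
clipped = torch.max(torch.min(img, hi), lo)
```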

krshrimali pushed a commit to krshrimali/pytorch that referenced this issue May 19, 2021
Summary:
Fixes pytorchgh-2793

Pull Request resolved: pytorch#52695

Reviewed By: mruberry

Differential Revision: D27395977

Pulled By: ezyang

fbshipit-source-id: f86aa240feb034d42e4c45447e72218f6a773c24
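With that change merged, recent PyTorch releases accept tensor bounds in torch.clamp directly (this reflects my understanding of the shipped API, not something stated in the thread):

```python
import torch

x = torch.tensor([0.0, 2.0, 5.0])
lo = torch.tensor([1.0, 1.0, 1.0])
hi = torch.tensor([4.0, 4.0, 4.0])

clamped = torch.clamp(x, min=lo, max=hi)  # elementwise: [1., 2., 4.]
```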