Understanding contiguous tensors and the contiguous() method, and the difference between view and reshape (TODO).

Memory sharing: in the example below, x walks its memory sequentially starting from 0, while y is a tensor whose logical element order no longer walks memory sequentially from 0. For example: when you call transpose(), PyTorch doesn't generate a new tensor with a new layout; it just modifies the meta information in the Tensor object, so the offset and stride are for the new shape.
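A minimal sketch of what the x/y example might look like (the variable names x and y come from the note; the concrete shape 3x4 is an assumption made for illustration):

```python
import torch

# x is contiguous: its logical row-major order matches the order of
# elements in the underlying storage, starting at offset 0.
x = torch.arange(12).reshape(3, 4)
print(x.is_contiguous())              # True
print(x.stride())                     # (4, 1)

# transpose() does not copy data; it only rewrites the metadata
# (sizes/strides) on a new Tensor object that shares x's storage.
y = x.transpose(0, 1)
print(y.is_contiguous())              # False
print(y.stride())                     # (1, 4)
print(x.data_ptr() == y.data_ptr())   # True -> same underlying memory

# view() requires a compatible (contiguous) layout, so it fails on y ...
try:
    y.view(-1)
except RuntimeError as e:
    print("view failed:", e)

# ... while contiguous() copies the data into a new contiguous buffer,
# and reshape() falls back to such a copy automatically when needed.
z = y.contiguous()
print(z.is_contiguous())              # True
print(y.reshape(-1).shape)            # works: torch.Size([12])
```

Because x and y share storage, writing through one is visible through the other (e.g. y[0, 0] = 100 also changes x[0, 0]); a contiguous() copy like z no longer shares memory with x.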