Rob
class VideoEditorViewController: UIViewController {
    private let viewModel = VideoEditorViewModel()
    private let canvas = CanvasView()
    private let picker = UIColorPickerViewController()

    override func viewDidLoad() {
        super.viewDidLoad()
        setCanvasUI()
    }
}

// MARK: - Private interface

private extension VideoEditorViewController {
    func setCanvasUI() {
        view.addSubview(canvas)
        canvas.backgroundColor = .clear

        bindViewModelToUpdateCanvas()
        bindCanvasToUpdateViewModel()
        addButtons()
    }

    func bindViewModelToUpdateCanvas() {
        viewModel.onUpdateDrawing = { [weak self] lines in
            self?.canvas.lines = lines
        }
    }

    func bindCanvasToUpdateViewModel() {
        canvas.onStartGesture = { [weak self] point in
            self?.viewModel.startNewLine(at: point)
        }

        canvas.onContinueGesture = { [weak self] points in
            self?.viewModel.appendToLastLine(points)
        }
    }

    func addButtons() {
        let size = CGSize(width: 50, height: 50)

        let undoButton = CanvasButton(size: size, image: UIImage(systemName: "arrowshape.turn.up.left.fill")) { [weak self] _ in
            self?.viewModel.undo()
        }

        let colorPicker = CanvasButton(size: size, image: UIImage(systemName: "paintpalette.fill")) { [weak self] _ in
            guard let self else { return }
            picker.delegate = self
            present(picker, animated: true)
        }

        let trashCanButton = CanvasButton(size: size, image: UIImage(systemName: "trash")) { [weak self] _ in
            self?.viewModel.undoAll()
        }

        let uploadViewButton = CanvasButton(size: size, image: UIImage(systemName: "envelope")) { _ in // [weak self]
            print("upload not implemented")
        }

        let toggleDrawing = CanvasButton(size: size, image: UIImage(systemName: "pencil.circle")) { [weak self] _ in
            self?.viewModel.isDrawable.toggle()
        }

        let stackView = UIStackView(arrangedSubviews: [
            undoButton,
            trashCanButton,
            colorPicker,
            uploadViewButton,
            toggleDrawing
        ])
        view.addSubview(stackView)

        stackView.axis = .vertical
        view.bringSubviewToFront(stackView)
        stackView.spacing = 30
        stackView.snp.makeConstraints { make in
            make.right.equalTo(view.snp.rightMargin).offset(-20)
            make.top.equalTo(view.snp.topMargin)
        }
        canvas.frame = view.bounds
    }
}

// MARK: - UIColorPickerViewControllerDelegate

extension VideoEditorViewController: UIColorPickerViewControllerDelegate {
    func colorPickerViewController(_ viewController: UIColorPickerViewController, didSelect color: UIColor, continuously: Bool) {
        viewModel.strokeColor = color
    }
}
  • You are constructing the stack view in viewDidAppear. Each time it appears, you are going to create a new stack view. This should be done only once, in viewDidLoad.

  • The color picker button’s closure presents the picker and then, as soon as that presentation completes, sets the canvas color. Setting the color should instead be done in the UIColorPickerViewControllerDelegate method colorPickerViewController(_:didSelect:continuously:).

  • You are not using UIImagePickerControllerDelegate, so that conformance declaration (especially where you do not implement any of its methods) is unnecessary.

  • In touchesBegan, you really should capture where the gesture started. Right now, you are discarding the first point of the gesture.

  • Where you are extracting the position from Set<UITouch>, you really should specify the view as self, otherwise you will be getting coordinates in the window’s coordinate system, not your current view’s coordinate system. (You might not notice if the view is located at (0, 0), but if it was not, you would not get the coordinates you expected.)

  • Likewise, when setting the frame of canvas, you should use the coordinate system of view. Specifically, you should reference view.bounds, not view.frame; the latter is in the coordinate system of the superview of view. In this case, there is not an observable difference, but if this view was not at (0,0), you will start to manifest problems.

  • When extracting updates during gestures, you should probably consider coalesced touches, for more fine-grained touch information (on capable hardware). It’s not relevant to the broader MVVM questions, but it is a consideration when processing touches.

  • Personally, I would also capture predictive touches to reduce lag in the UI (but you need a slightly more refined model to capture this dimension). See https://stackoverflow.com/a/64709364/1271826 or https://stackoverflow.com/a/58456179/1271826.

  • When rendering gestures, I would also be inclined to smooth them via Catmull-Rom or Hermite splines. See https://stackoverflow.com/a/34583708/1271826. Again, unrelated to the broader MVVM question, so I will leave that to the reader.

  • I must confess that it offends my MVVM-purist tendencies that a model object (like Line) is still entangled with UIKit. This is because of the UIColor property. I might be inclined to abstract this to some non-platform-specific structure, but, again, that is beyond the scope of this question.

    But as a general design principle, many of us avoid entangling model objects with platform-specific types like this. FWIW.

  • I did not grok the purpose of the drawable property of Line. I attempted to preserve it in my example, but I am not sure what the utility is of capturing a Line that is not drawable. I interpreted it as a path that was not to be rendered, but that doesn’t make sense IMHO, so I apologize if I misunderstood your intent. Nonetheless, I attempted to preserve the idea. It is not quite relevant to the broader question, but FWIW.
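To illustrate the touch-handling points above, here is a minimal sketch of what the touchesBegan/touchesMoved overrides in CanvasView might look like. The Line type and the onStartGesture/onContinueGesture closures come from the code above; everything else is illustrative, not a definitive implementation:

```swift
import UIKit

class CanvasView: UIView {
    var lines: [Line] = [] { didSet { setNeedsDisplay() } }
    var onStartGesture: ((CGPoint) -> Void)?
    var onContinueGesture: (([CGPoint]) -> Void)?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // Capture the first point of the gesture, in this view’s coordinate
        // system (location(in: self), not location(in: nil)).
        onStartGesture?(touch.location(in: self))
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // Use coalesced touches, where available, for finer-grained points.
        let resolved = event?.coalescedTouches(for: touch) ?? [touch]
        onContinueGesture?(resolved.map { $0.location(in: self) })
    }
}
```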

class VideoEditorViewModel {
    var strokeColor: UIColor = .blue
    var strokeWidth: Float = 10
    var isDrawable: Bool = true
    private(set) var lines: [Line] = [] { didSet { onUpdateDrawing?(lines) } }
    var onUpdateDrawing: (([Line]) -> Void)?
}

extension VideoEditorViewModel {
    @discardableResult
    func undo() -> Line? {
        lines.popLast()
    }

    func undoAll() {
        lines.removeAll()
    }

    func startNewLine(at point: CGPoint?) {
        let line = Line(
            strokeWidth: strokeWidth,
            color: strokeColor,
            points: [point].compactMap { $0 },
            drawable: isDrawable
        )
        lines.append(line)
    }

    func appendToLastLine(_ points: [CGPoint]?) {
        guard let points else { return }

        let index = lines.count - 1
        if index >= 0 {
            lines[index].points.append(contentsOf: points)
        }
    }
}
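As a quick illustration of how this view model behaves on its own (hypothetical usage, independent of any view):

```swift
let viewModel = VideoEditorViewModel()

viewModel.onUpdateDrawing = { lines in
    print("drawing now has \(lines.count) line(s)")
}

viewModel.startNewLine(at: CGPoint(x: 0, y: 0))    // lines.count == 1
viewModel.appendToLastLine([CGPoint(x: 5, y: 5)])  // still one line, now with two points
viewModel.undo()                                   // lines.count == 0
```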
            self?.viewModel.isDrawable.toggle()
        }

        let stackView = UIStackView(arrangedSubviews: [
            undoButton,
            trashCanButton,
            colorPicker,
            uploadViewButton,
            toggleDrawing
        ])
        view.addSubview(stackView)

        stackView.axis = .vertical
        view.bringSubviewToFront(stackView)
        stackView.spacing = 30
        stackView.snp.makeConstraints { make in
            make.right.equalTo(view.snp.rightMargin).offset(-20)
            make.top.equalTo(view.snp.topMargin)
        }
        canvas.frame = view.bounds // bounds, not frame: the canvas lives in view’s coordinate system
    }
}

// MARK: - UIColorPickerViewControllerDelegate

extension VideoEditorViewController: UIColorPickerViewControllerDelegate {
    func colorPickerViewController(_ viewController: UIColorPickerViewController, didSelect color: UIColor, continuously: Bool) {
        viewModel.strokeColor = color
    }
}

class VideoEditorViewModel {
    var strokeColor: UIColor = .blue
    var strokeWidth: Float = 10
    var isDrawable: Bool = true
    private(set) var lines: [Line] = [] { didSet { onUpdateDrawing?(lines) } }
    var onUpdateDrawing: (([Line]) -> Void)?
}

extension VideoEditorViewModel {
    @discardableResult
    func undo() -> Line? {
        lines.popLast()
    }

    func undoAll() {
        lines.removeAll()
    }

    func startNewLine(point: CGPoint?) {
        let line = Line(
            strokeWidth: strokeWidth,
            color: strokeColor,
            points: [point].compactMap { $0 },
            drawable: isDrawable
        )
        lines.append(line)
    }

    func appendToLastLine(_ points: [CGPoint]?) {
        guard let points, !lines.isEmpty else { return }

        lines[lines.count - 1].points.append(contentsOf: points)
    }
}