
We have an application that has grown over the years and was originally designed pretty monolithically. In recent months we have broken the application into smaller modules, which has made it a lot more maintainable.

However, the main module in our code still declares dependencies on pretty much every other module used anywhere in our codebase. This is ridiculous, because the main module doesn't actually have those dependencies; the modules we pulled out of it do.

The application still runs fine, though, since every module is depended upon somewhere and is therefore loaded and available.

When we now pull a module out into another codebase, it usually doesn't work, because its dependencies aren't declared correctly (they were declared on the main module in the original application).

How would I determine if an individual module correctly declared all of its dependencies?

  • Other than run it and see if it fails? Not sure. Commented Apr 29, 2015 at 19:23
  • I'd suggest writing some static code analysis so you can remove direct dependencies from the main app wherever they are already used by one of the main app's own dependencies, to reduce the number of things the main module loads. That would at least cut down the list of dependencies to check through when pulling out a module, and the script/program could be run periodically to further shrink the main app's dependency list (a rough sketch of such a check follows these comments). Otherwise I agree with Kevin B; the original problem is difficult without fixed types/classes you can look for. Commented Apr 29, 2015 at 19:55
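
For what such a static check could look like: below is a rough, framework-agnostic sketch. The `ModuleInfo` shape, `auditModule`, the provider index, and all module/service names are invented for illustration; the idea is simply to compare what each module declares against what its source actually references.

```typescript
// Hypothetical sketch: flag modules whose declared dependencies don't match
// what their source actually uses. All names and shapes here are made up.
interface ModuleInfo {
  name: string;
  declaredDeps: string[];    // dependencies listed in the module definition
  referencedNames: string[]; // services/components referenced in its source
}

function auditModule(mod: ModuleInfo, providerIndex: Map<string, string>): void {
  // providerIndex maps each service/component name to the module that provides it.
  const needed = new Set(
    mod.referencedNames
      .map(name => providerIndex.get(name))
      .filter((owner): owner is string => owner !== undefined && owner !== mod.name)
  );
  const declared = new Set(mod.declaredDeps);

  const missing = [...needed].filter(dep => !declared.has(dep));
  const unused = [...declared].filter(dep => !needed.has(dep));

  if (missing.length) {
    console.warn(`${mod.name} uses but does not declare: ${missing.join(", ")}`);
  }
  if (unused.length) {
    console.info(`${mod.name} declares but apparently never uses: ${unused.join(", ")}`);
  }
}

// Example: "reports" references a formatter provided by "formatting" but never declares it.
const providerIndex = new Map([["currencyFormatter", "formatting"], ["mailer", "messaging"]]);
auditModule(
  { name: "reports", declaredDeps: ["messaging"], referencedNames: ["currencyFormatter", "mailer"] },
  providerIndex
);
```

Run against every module, something like this would flag both undeclared dependencies (the case that breaks modules when they are pulled out) and dependencies that only the main module declares without using.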

1 Answer


How about writing a unit test with mocks for the injected dependencies? Calls to non-mocked objects should trigger a failure.
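
As a minimal sketch of that idea, assuming constructor-injected dependencies and no particular test or mocking framework (the `ReportModule`, `Mailer`, `Store`, and `strictMock` names are all invented): provide mocks for exactly the dependencies the module declares, and make those mocks throw on anything that wasn't explicitly stubbed. A module that quietly relies on something it never declared will then fail the test instead of silently working because the dependency happened to be loaded elsewhere.

```typescript
// Minimal sketch with constructor-injected dependencies; the module,
// interfaces, and method names below are invented for illustration only.
interface Mailer { send(to: string, body: string): void; }
interface Store { save(key: string, value: string): void; }

class ReportModule {
  constructor(private mailer: Mailer, private store: Store) {}
  publish(report: string): void {
    this.store.save("latest", report);
    this.mailer.send("ops@example.com", report);
  }
}

// A strict mock: members you stub behave normally, anything else throws,
// which is what surfaces calls into dependencies the module never declared.
function strictMock<T extends object>(name: string, stubs: Partial<T>): T {
  return new Proxy(stubs as T, {
    get(target, prop) {
      if (prop in target) return (target as any)[prop];
      throw new Error(`${name}.${String(prop)} was called but never mocked`);
    },
  });
}

// The test provides ONLY the dependencies ReportModule declares. If the module
// reached for anything beyond them, the strict mock would throw and fail the test.
const mailer = strictMock<Mailer>("Mailer", { send: () => {} });
const store = strictMock<Store>("Store", { save: () => {} });
new ReportModule(mailer, store).publish("q1 report");
console.log("ReportModule touched only its declared, mocked dependencies");
```

Many mocking libraries offer a similar "strict" mode that fails the test on any unexpected call, so the same pattern works without hand-rolling the proxy.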
