This is one of those important foundational questions that, answered correctly, sets the work down the right path. Here’s hoping I can do that for you!
First, let’s define “general AI”. I’ve provided my re-definition here, which will help frame the question:
Artificial General Intelligence: A (Re-)Definition
In the context of that definition, we’re not constrained to mapping the human brain. In fact, that constraint becomes a significant limitation on solving AGI. We should look more broadly at all animal brains as special cases of a general solution. That general solution, in turn, can yield special-case solutions for the applications we need solved.
What should we map from these different animal brains? Specifically, the connectome. Here’s what that means for humans, from the Human Connectome Project: Mapping the human brain connectivity
These maps are important because they provide not just clues but the exact information-processing pathways animal brains use to accomplish things like understanding language by connecting the visual and auditory regions of the brain.
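To make the idea of a pathway concrete, here is a minimal sketch that treats a connectome as a directed graph of brain regions and searches for a processing route between them. The region names and connections are hypothetical placeholders for illustration, not real connectome data.

```python
from collections import deque

# A toy connectome: brain regions as nodes, measured connections as
# directed edges. Names and links are hypothetical, not real data.
connectome = {
    "V1": ["V2"],                          # primary visual cortex
    "V2": ["angular_gyrus"],
    "A1": ["wernicke"],                    # primary auditory cortex
    "wernicke": ["angular_gyrus", "broca"],
    "angular_gyrus": ["wernicke"],
    "broca": [],
}

def find_pathway(graph, start, goal):
    """Breadth-first search for a processing pathway between two regions."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no pathway connects the two regions

print(find_pathway(connectome, "V1", "wernicke"))
# ['V1', 'V2', 'angular_gyrus', 'wernicke']
```

The point of the sketch is that once the wiring is mapped, questions like “how does visual input reach the language system?” become graph queries rather than guesswork.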
So what is the “mathematical” component involved? Here, the answer for me is clear: information theory. These connectome pathways are not simply chemical or electrical signaling pathways; they are information-processing pathways. Their chemical and electrical nature is simply an implementation of the general information-processing function that needs to be mathematically modeled.
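As a taste of what “information theory” buys here, the sketch below computes the mutual information I(X;Y), in bits, between a signal X and a downstream response Y from their joint distribution. This is a generic information-theory illustration, not a model of any specific neural pathway.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, given a joint distribution {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# A noiseless binary channel: Y copies X exactly -> 1 bit shared.
noiseless = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(noiseless))  # 1.0

# A fully noisy channel: Y is independent of X -> 0 bits shared.
noisy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(noisy))  # 0.0
```

The same quantity applies regardless of the substrate, which is exactly the point: the chemistry is the implementation, while the bits flowing through the pathway are what needs modeling.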
The function here is singular, not plural: only one information-processing function is needed. My view is that all biological brains use this same, singular function, even if the mechanisms that implement it differ from organism to organism (and they are unlikely to differ much anyway). For implementation purposes, this is extremely convenient, since there is no need to search for just-the-right algorithm for each new use case.
Let’s address one last component of your question: whether knowledge of the connectome map is a requirement for AGI to reach human-level intelligence (per the re-definition described in the link provided previously). No, I don’t think it is. With a properly constructed framework, finding working solutions could be a simple matter of directed trial-and-error using, for example, genetic algorithms. This is how nature took single-celled organisms and built human-level intelligent creatures. However, we can accelerate that directed search with knowledge gained from connectome studies.
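To show what “directed trial-and-error” looks like in miniature, here is a minimal genetic algorithm evolving bitstrings toward all ones (the classic OneMax toy problem). The fitness function, mutation rate, and population size are illustrative choices only; this is a sketch of the search technique, not a recipe for AGI.

```python
import random

random.seed(0)  # deterministic run for reproducibility
GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 32, 40, 200, 0.02

def fitness(genome):
    return sum(genome)  # count of 1s; higher is fitter

def mutate(genome):
    return [1 - g if random.random() < MUT_RATE else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)  # single-point crossover
    return a[:cut] + b[cut:]

# Random initial population: undirected variation.
pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == GENOME_LEN:
        break  # a perfect genome evolved
    parents = pop[: POP_SIZE // 2]  # selection: keep the fitter half
    children = [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(POP_SIZE - len(parents))
    ]
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best))  # typically reaches 32 well before 200 generations
```

Selection makes the trial-and-error *directed*: random variation proposes candidates, but only the fitter ones propagate, which is the same loop nature ran, just on a vastly larger scale and timescale.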