I am trying to write a function whose argument is a SymPy symbol with a default value, but I am not getting the output I expect.
Here is my code:
import sympy as sy
def addtwo(y=sy.symbols('y', real=True)):
    return y + 2
x = addtwo()
print(type(x))
print(x)
# note: y here is a plain Symbol('y') defined earlier in my session
print(x.subs(y, 1))
with the output:
<class 'sympy.core.add.Add'>
y + 2
y + 2
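I suspect the problem is that SymPy symbols compare by name and assumptions together, so the symbol created in the default argument (with real=True) is not the same symbol as the bare Symbol('y') I pass to subs. A quick check seems to confirm this (the names below are just for illustration):
import sympy as sy
y_default = sy.symbols('y', real=True)   # like the symbol baked into the default argument
y_plain = sy.symbols('y')                # like the bare symbol I define later
print(y_default == y_plain)              # False: assumptions are part of a symbol's identity
print((y_default + 2).subs(y_plain, 1))  # y + 2, i.e. no substitution happens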
However, the code
y = sy.symbols('y')
x = addtwo(y)
print(x)
print(x.subs(y, 1))
gives the expected output:
y + 2
3
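For what it's worth, a minimal workaround that seems to behave as I expected is to create the default symbol once at module level and reuse that exact object (Y_DEFAULT is a name I made up for this sketch), though I am not sure this is the intended pattern:
import sympy as sy
Y_DEFAULT = sy.symbols('y', real=True)  # create the default symbol once

def addtwo(y=Y_DEFAULT):
    return y + 2

x = addtwo()
print(x.subs(Y_DEFAULT, 1))  # 3, because subs receives the identical symbol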