I looked around the 'net to find the definition of a broadband signal. I found that the term does not have any single specific meaning -- e.g., there is no cutoff such as (to make up a figure) 8.46 megahertz above which a signal is considered "broadband".
The definitions I found were in terms of whether the signal's bandwidth exceeded the bandwidth of whatever filter was in place.
With that definition, I would be forced to say the following: unless one is given external information about the filter being used, any signal defined by its samples over a finite time must be considered band-limited to the Nyquist frequency; the Nyquist frequency can then be taken as the filter bandwidth; and thus, absent that external information, any signal defined by a finite list of samples is, by definition, not a broadband signal.
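To illustrate that numerically, here is a minimal numpy sketch (the sample rate and record length are made-up figures): whatever waveform the finite record came from, the DFT of the samples has no frequency bins above the Nyquist frequency, so the samples alone cannot show content beyond it.

```python
import numpy as np

fs = 1000.0                 # made-up sample rate, in Hz
n = 1000                    # a finite record of n samples
rng = np.random.default_rng(0)
x = rng.standard_normal(n)  # any finite list of samples at all

# The real-input DFT bin frequencies run from 0 Hz up to Nyquist (fs / 2)
# and no further -- the samples by themselves carry nothing above that.
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
print(freqs[0], freqs[-1])  # → 0.0 500.0
```

Whatever `x` holds, the analysis axis stops at fs / 2; any claim about content above that line has to come from outside knowledge, not from the samples.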
A statement that a signal was periodic would imply that there is a "true" signal beyond what was sampled, and so that the signal was not defined by its samples and might potentially be broadband. But not every signal defined as periodic yet finitely sampled is broadband; for example, a sine wave is not broadband.
Pulses, square waves, and triangle waves are examples of signals for which there is a "true" signal beyond what was sampled (their sharp edges and corners require infinite bandwidth), and yet clearly such signals need not be periodic.
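The sine-versus-square contrast can be made concrete with a small numpy sketch (the 10 Hz fundamental and 1 kHz sample rate are arbitrary choices): a finitely sampled sine concentrates its energy in a single DFT bin, while a square wave at the same fundamental spreads significant energy across every odd harmonic all the way up to Nyquist -- and the "true" continuous square wave keeps going past it.

```python
import numpy as np

fs = 1000.0            # arbitrary sample rate, Hz
n = 1000               # one-second record
f0 = 10.0              # fundamental chosen to land exactly on a DFT bin
t = np.arange(n) / fs

sine = np.sin(2 * np.pi * f0 * t)

# Ideal square wave at the same fundamental, built with integer
# arithmetic so the half-period boundaries are exact:
period = int(fs / f0)  # 100 samples per cycle
k = np.arange(n)
square = np.where(k % period < period // 2, 1.0, -1.0)

def magnitude_spectrum(x):
    return np.abs(np.fft.rfft(x))

s_sine = magnitude_spectrum(sine)
s_square = magnitude_spectrum(square)

# Count bins holding more than 1% of the peak magnitude:
bins_sine = int(np.sum(s_sine > 0.01 * s_sine.max()))
bins_square = int(np.sum(s_square > 0.01 * s_square.max()))
print(bins_sine, bins_square)  # → 1 25  (odd harmonics 1, 3, ..., 49)
```

With the same finite record, the sine fits comfortably inside Nyquist, while the square wave's harmonics are still significant right at the edge; the portion of the "true" square wave above fs / 2 simply never made it into the samples.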
So again: if you are given a filter bandwidth, or if you know that the signal has content beyond what was sampled, then you might be dealing with a broadband signal. But if the signal is defined by a finite list of samples (as yours is), then the definition of "broadband" appears to me to indicate that such a signal could never be considered broadband.